113 Data Managers jobs in Dubai
Database Administration Job Opening
Posted today
Job Description
We are seeking a skilled Database Administrator to join our team. In this role, you will be responsible for the installation, configuration, and maintenance of Oracle databases.
Your primary focus will be on ensuring database performance, identifying bottlenecks, and implementing solutions to optimize system resources.
Responsibilities:
- Install and configure Oracle databases to meet business requirements
- Monitor database performance and identify areas for improvement
- Implement database security best practices, including user roles and privileges
- Collaborate with development teams to ensure seamless integration between databases and applications
- Participate in routine database maintenance tasks, such as upgrades and patching
Requirements:
- Oracle Certified Associate (OCA) certification or equivalent experience
- Strong understanding of relational database concepts, SQL, and PL/SQL
- Knowledge of Oracle RAC, Grid Infrastructure, ASM, RMAN, and Data Guard
- Experience with database backup and recovery procedures
- Basic knowledge of Linux OS and database monitoring tools
What We Offer:
- Opportunity to work with a dynamic team
- Professional growth and development opportunities
- Competitive compensation package
Data Quality Analyst
Posted today
Job Description
Senior Data Quality Analyst – Capgemini
About Capgemini
Capgemini is a global leader in consulting, digital transformation, technology, and engineering services. With a presence in over 50 countries and a strong heritage of innovation, Capgemini enables organizations to realize their business ambitions through an array of services from strategy to operations. Our collaborative approach and a people-centric work culture have made us a partner of choice for clients across industries.
Role Overview
We are seeking a highly skilled Senior Data Quality Analyst with a robust background in designing, implementing, and maintaining data quality frameworks leveraging Python or Collibra. The ideal candidate will be adept at ensuring data accuracy, consistency, completeness, and reliability across large-scale cloud-based platforms, especially within Azure Databricks environments. This role requires expertise in automated data quality assurance, a deep understanding of data governance, and hands-on experience integrating quality controls into modern data pipelines.
The Senior Data Quality Analyst will be embedded within an agile squad dedicated to a specific business mission while contributing to a broader program comprising 4 to 8 interconnected squads. Collaboration, technical leadership, and a continuous improvement mindset are essential as you work cross-functionally to elevate the organization's data quality standards.
Key Responsibilities
1. Development & Integration
- Design, develop, and implement automated data quality checks using Python scripts and libraries or Collibra Data Quality components.
- Integrate data quality validation logic within existing ETL/ELT pipelines operating on Azure Databricks, ensuring quality gates are consistently enforced across all data flows.
- Develop and maintain reusable Python modules that perform anomaly detection, schema validation, and rule-based data quality checks to enable rapid scaling of quality coverage.
- Collaborate with data engineering teams to embed continuous quality controls throughout the data ingestion, transformation, and consumption lifecycle.
- Support the deployment and management of Collibra-based data quality solutions to automate governance workflows and stewardship activities.
2. Data Quality Management
- Define, measure, and rigorously enforce data quality metrics, thresholds, and Service Level Agreements (SLAs) tailored to business-critical datasets.
- Utilize Collibra to manage and operationalize data governance workflows, maintain business glossaries, and delineate stewardship responsibilities.
- Monitor the integrity of data pipelines for completeness, accuracy, timeliness, and consistency across distributed and cloud-native environments.
- Conduct detailed root cause analyses for complex data quality issues, collaborating with engineers and domain experts to drive permanent remediation and prevention strategies.
- Implement and continuously refine monitoring frameworks, utilizing dashboards and alerting systems (built using Python and Collibra integrations) for real-time visibility into key data quality indicators.
3. Support & Operations
- Act as a Level 2/3 escalation point for data quality incidents, troubleshooting issues and coordinating with other agile squads and technical teams for rapid resolution.
- Work closely with product owners, business analysts, and key stakeholders to understand evolving data requirements and ensure quality expectations are aligned and met.
- Maintain and optimize operational dashboards for ongoing data quality monitoring, leveraging both Python-based and Collibra-integrated solutions.
- Participate actively in agile ceremonies, including sprint planning, daily standups, reviews, and retrospectives, contributing to squad goals and continuous delivery improvements.
4. Governance & Best Practices
- Establish, document, and evangelize data quality standards, validation frameworks, and best practices across squads and the broader data organization.
- Maintain comprehensive documentation on validation rules, automated test cases, and quality assurance procedures, ensuring transparency and repeatability.
- Mentor, coach, and upskill junior data engineers and analysts in data quality concepts, tools, and processes to foster a quality-first culture.
- Ensure strict compliance with data governance, privacy, and security policies by leveraging Collibra's governance and stewardship frameworks.
- Continuously assess emerging technologies, tools, and methodologies for potential enhancement of the data quality ecosystem.
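As a sketch of what the rule-based checks described above can look like in plain Python (the rule set, column names, and sample data here are invented for illustration and are not part of any actual Capgemini framework):

```python
# Minimal rule-based data quality check: each rule is a (column, predicate,
# description) triple applied row by row; violations are collected for
# reporting. All names are illustrative.

def run_checks(rows, rules):
    """Return a list of (row_index, description) for every failed rule."""
    violations = []
    for i, row in enumerate(rows):
        for column, predicate, description in rules:
            if not predicate(row.get(column)):
                violations.append((i, description))
    return violations

# Example rules: non-empty customer_id, non-negative numeric amount.
rules = [
    ("customer_id", lambda v: v not in (None, ""), "customer_id missing"),
    ("amount", lambda v: isinstance(v, (int, float)) and v >= 0,
     "amount negative or non-numeric"),
]

rows = [
    {"customer_id": "C1", "amount": 10.0},
    {"customer_id": "", "amount": -5},
]
print(run_checks(rows, rules))
```

In practice such a module would be packaged for reuse across pipelines, with predicates generated from a rule catalogue rather than hard-coded lambdas.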
Qualifications
- Bachelor's or Master's degree in Computer Science, Data Management, Information Systems, or a closely related field.
- Several years of progressive experience in data quality engineering, data management, or related data roles within complex technology environments.
- Demonstrable expertise in Python, including the development of reusable data quality and validation libraries.
- Extensive hands-on experience with Azure Databricks, including cloud-native data processing, ETL/ELT orchestration, and distributed computing concepts.
- Proficiency with Collibra Data Quality platform or equivalent data governance and stewardship tools.
- Strong track record working in agile environments, participating in cross-functional teams, and adapting to rapidly evolving project requirements.
- Excellent analytical, problem-solving, and communication skills, with the ability to convey complex technical topics to both technical and non-technical audiences.
Preferred Certifications (One or More)
- Databricks Certified Data Engineer Associate or Professional
- Microsoft Certified: Azure Data Engineer Associate
- Python Institute Certifications (PCAP, PCPP)
- Collibra Ranger or Collibra Data Quality Steward Certifications
Key Skills & Competencies
- Deep understanding of data quality frameworks, methodologies, and industry best practices
- Hands-on experience building automated data quality tests using Python, PySpark, or similar open-source libraries
- Expertise in designing quality validation steps within ETL/ELT data pipelines for large volumes of structured and semi-structured data
- Familiarity with cloud data ecosystems, especially Azure and Databricks
- Proven ability to operationalize and scale data governance using Collibra or comparable tools
- Experience with dashboarding, data visualization, and monitoring tools for real-time data quality tracking
- Strong collaboration, leadership, and mentoring abilities within agile squads or matrix teams
- Knowledge of data privacy, security, and regulatory compliance requirements
- Ability to drive innovation and continuous improvement in data quality processes
What We Offer
- Opportunity to work on cutting-edge data platforms and technologies in a global, multicultural environment
- Collaborative and agile work culture with empowering career growth opportunities
- Competitive remuneration, benefits, and professional certification support
- Access to Capgemini's global learning platforms, mentorship programs, and technology communities
- Exposure to high-impact projects with Fortune 500 clients
Data Quality Specialist
Posted today
Job Description
Job Title: Data Quality Specialist
About the Role:
We are seeking a skilled professional to join our team as a Data Quality Specialist. This role is responsible for ensuring the accuracy, consistency, and completeness of data across large-scale cloud-based platforms.
Key Responsibilities:
1. Data Quality Engineering
- Design, develop, and implement automated data quality checks using Python scripts and libraries or Collibra Data Quality components.
- Integrate data quality validation logic within existing ETL/ELT pipelines operating on Azure Databricks, ensuring quality gates are consistently enforced across all data flows.
- Develop and maintain reusable Python modules that perform anomaly detection, schema validation, and rule-based data quality checks to enable rapid scaling of quality coverage.
- Collaborate with data engineering teams to embed continuous quality controls throughout the data ingestion, transformation, and consumption lifecycle.
- Support the deployment and management of Collibra-based data quality solutions to automate governance workflows and stewardship activities.
2. Data Quality Management
- Define, measure, and rigorously enforce data quality metrics, thresholds, and Service Level Agreements (SLAs) tailored to business-critical datasets.
- Utilize Collibra to manage and operationalize data governance workflows, maintain business glossaries, and delineate stewardship responsibilities.
- Monitor the integrity of data pipelines for completeness, accuracy, timeliness, and consistency across distributed and cloud-native environments.
- Conduct detailed root cause analyses for complex data quality issues, collaborating with engineers and domain experts to drive permanent remediation and prevention strategies.
- Implement and continuously refine monitoring frameworks, utilizing dashboards and alerting systems (built using Python and Collibra integrations) for real-time visibility into key data quality indicators.
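The metric-threshold-SLA pattern described above can be illustrated with a small completeness check; the field names and the thresholds are assumptions for the example, not values from any real SLA:

```python
# Completeness metric vs. an SLA threshold: the share of non-empty values per
# column, flagged when it falls below the agreed threshold.

def completeness(rows, column):
    """Fraction of rows where `column` is present and non-empty."""
    if not rows:
        return 1.0
    filled = sum(1 for r in rows if r.get(column) not in (None, ""))
    return filled / len(rows)

def sla_report(rows, thresholds):
    """Map column -> (metric, threshold, passed) for each SLA entry."""
    return {
        col: (completeness(rows, col), thr, completeness(rows, col) >= thr)
        for col, thr in thresholds.items()
    }

records = [
    {"email": "a@example.com", "country": "AE"},
    {"email": "", "country": "AE"},
    {"email": "b@example.com", "country": None},
    {"email": "c@example.com", "country": "AE"},
]
print(sla_report(records, {"email": 0.95, "country": 0.70}))
```

A monitoring framework would compute metrics like these on a schedule and feed failures into the dashboards and alerting systems the posting mentions.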
Required Skills and Qualifications
- Education: Bachelor's or Master's degree in Computer Science, Data Management, Information Systems, or a closely related field.
- Experience: Several years of progressive experience in data quality engineering, data management, or related data roles within complex technology environments.
- Technical Skills: Demonstrable expertise in Python, including the development of reusable data quality and validation libraries. Extensive hands-on experience with Azure Databricks, including cloud-native data processing, ETL/ELT orchestration, and distributed computing concepts.
- Collibra: Proficiency with Collibra Data Quality platform or equivalent data governance and stewardship tools.
- Agile Delivery: Strong track record working in agile environments, participating in cross-functional teams, and adapting to rapidly evolving project requirements.
- Problem-Solving and Communication: Excellent analytical, problem-solving, and communication skills, with the ability to convey complex technical topics to both technical and non-technical audiences.
Benefits
- Opportunity to work on cutting-edge data platforms and technologies in a global, multicultural environment.
- Collaborative and agile work culture with empowering career growth opportunities.
- Competitive remuneration, benefits, and professional certification support.
- Access to Capgemini's global learning platforms, mentorship programs, and technology communities.
- Exposure to high-impact projects with Fortune 500 clients.
Preferred Certifications
- Databricks Certified Data Engineer Associate or Professional.
- Microsoft Certified: Azure Data Engineer Associate.
- Python Institute Certifications (PCAP, PCPP).
- Collibra Ranger or Collibra Data Quality Steward Certifications.
Key Skills and Competencies
- Deep understanding of data quality frameworks, methodologies, and industry best practices.
- Hands-on experience building automated data quality tests using Python, PySpark, or similar open-source libraries.
- Expertise in designing quality validation steps within ETL/ELT data pipelines for large volumes of structured and semi-structured data.
- Familiarity with cloud data ecosystems, especially Azure and Databricks.
- Proven ability to operationalize and scale data governance using Collibra or comparable tools.
- Experience with dashboarding, data visualization, and monitoring tools for real-time data quality tracking.
- Strong collaboration, leadership, and mentoring abilities within agile squads or matrix teams.
- Knowledge of data privacy, security, and regulatory compliance requirements.
- Ability to drive innovation and continuous improvement in data quality processes.
Data Quality Analyst
Posted today
Job Description
The Dubai Health Authority (DHA) oversees the health sector in Dubai and aims to provide an accessible, effective, and integrated healthcare system, in line with the Dubai Strategic Plan 2026. The DHA operates healthcare facilities including hospitals, specialty centers, and primary health centers throughout Dubai, with a focus on quality, efficiency, patients, and staff satisfaction.
Role Description
This is a full-time on-site role for a Data Quality Analyst at Dubai Health Authority. The Data Quality Analyst will be responsible for data quality, data governance, analytical skills, data management, and data analytics tasks to ensure the accuracy and reliability of healthcare data.
Qualifications
- Data Quality, Data Governance, and Data Management skills
- Analytical Skills and Data Analytics expertise
- Experience in healthcare data analysis and management
- Strong attention to detail and problem-solving abilities
- Excellent communication and teamwork skills
- Bachelor's degree in Data Science, Computer Science, Information Management, or a related field
Seniority level: Entry level
Employment type: Full-time
Job function: Information Technology
Industries: Hospitals and Health Care
Senior Data Quality Specialist
Posted today
Job Description
We are seeking a highly skilled professional to lead our data quality initiatives.
The ideal candidate will design, implement, and maintain robust data quality frameworks leveraging Python or Collibra.
This role requires expertise in automated data quality assurance, a deep understanding of data governance, and hands-on experience integrating quality controls into modern data pipelines.
The selected individual will be embedded within an agile squad dedicated to a specific business mission while contributing to a broader program comprising multiple interconnected squads.
Key Responsibilities
1. Data Quality Engineering
- Design and develop automated data quality checks using Python scripts and libraries or Collibra Data Quality components.
- Integrate data quality validation logic within existing ETL/ELT pipelines operating on Azure Databricks, ensuring quality gates are consistently enforced across all data flows.
- Develop and maintain reusable Python modules that perform anomaly detection, schema validation, and rule-based data quality checks to enable rapid scaling of quality coverage.
2. Data Governance and Stewardship
- Define, measure, and rigorously enforce data quality metrics, thresholds, and Service Level Agreements (SLAs) tailored to business-critical datasets.
- Utilize Collibra to manage and operationalize data governance workflows, maintain business glossaries, and delineate stewardship responsibilities.
3. Monitoring and Improvement
- Monitor the integrity of data pipelines for completeness, accuracy, timeliness, and consistency across distributed and cloud-native environments.
- Conduct detailed root cause analyses for complex data quality issues, collaborating with engineers and domain experts to drive permanent remediation and prevention strategies.
The successful candidate will have a strong background in data quality engineering, excellent analytical and problem-solving skills, and the ability to communicate complex technical topics to both technical and non-technical audiences.
A bachelor's or master's degree in Computer Science, Data Management, Information Systems, or a closely related field is required. Additional certifications such as Databricks Certified Data Engineer Associate or Professional are a plus.
About Us
We offer a collaborative and agile work culture with opportunities for career growth and development.
Our team works on cutting-edge data platforms and technologies in a global, multicultural environment.
We provide competitive remuneration, benefits, and support for professional certifications.
Join us in shaping the future of data quality and governance.
Data Quality Improvement Specialist
Posted today
Job Description
The primary objective of this role is to monitor and enhance the consistency and completeness of global Service Line platform data quality. This involves collaborating with markets to manage and analyse Service Line master data such as clients, products, campaigns, and suppliers.
Globally and regionally, the role identifies, measures, governs, and reports on data quality, supporting our Service Line Operations team to drive actionable improvement in conjunction with Market SL COOs, peers, and team members. The goal is to cleanse, govern, and enhance data quality.
The role also supports the organization in building data quality awareness, with consistency, completeness, accuracy, auditability, and timeliness among the key metrics considered.
Key Responsibilities:
Develops a deep understanding of the Service Line technology landscape including the data flows involving local Service Line systems, D365, other corporate systems and global onboarding solutions to be able to critically assess the operational and business change required to realise the value from our investments.
Conducts periodic reviews with our market and functional teams to identify inaccurate or legacy data and takes the required actions to deactivate/eliminate/rectify master records so as to avoid financial implications.
Works with Global MDM team to share insights/inputs based on the discussions and analysis around master data collected at local/regional level.
Supports data entry and input into our Service Line platform, applying appropriate QA to ensure data quality from the start.
Requirements:
Experience as a Manager
Senior Data Quality Engineer
Posted today
Job Description
Job Summary:
We are seeking a skilled Senior QA Tester to join our team. The ideal candidate will have 8+ years of experience in finance-related projects and functional knowledge of Revenue Accounting and Oracle ERP Finance Modules.
The successful candidate will possess hands-on experience in database and data warehouse testing, preferably in Oracle DB, with a strong understanding of ETL testing and of QA processes such as test strategy, test cases, and defect triage.
In this role, you will analyze ETL mapping documents, develop and execute SQL scripts based on these documents, verify data correctness and integrity, and ensure seamless data flow.
You will also contribute to QA processes and provide expert-level support to ensure the quality of our data sources, extracts, transformations, and loads.
Key Responsibilities:
- Analyze ETL mapping documents to identify data flows and relationships
- Develop and execute SQL scripts to verify data correctness and integrity
- Work closely with cross-functional teams to ensure smooth data integration
- Provide expert-level support to resolve data quality issues
Requirements:
- 8+ years of experience in finance-related projects
- Functional knowledge of Revenue Accounting and Oracle ERP Finance Modules
- Hands-on experience in database and data warehouse testing, preferably in Oracle DB
- Strong understanding of ETL testing and of QA processes such as test strategy, test cases, and defect triage
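The SQL-based verification this posting describes often reduces to reconciling counts and aggregates between source and target tables. A minimal sketch in Python, using an in-memory SQLite database as a stand-in for Oracle (the table and column names are invented for the example):

```python
import sqlite3

# Row-count and amount-sum reconciliation between a source and target table,
# the kind of check an ETL tester derives from a mapping document.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_revenue (id INTEGER, amount REAL);
    CREATE TABLE tgt_revenue (id INTEGER, amount REAL);
    INSERT INTO src_revenue VALUES (1, 100.0), (2, 250.5), (3, 75.0);
    INSERT INTO tgt_revenue VALUES (1, 100.0), (2, 250.5), (3, 75.0);
""")

def reconcile(conn, src, tgt):
    """Compare row counts and amount totals; return a dict of check results."""
    query = "SELECT COUNT(*), COALESCE(SUM(amount), 0) FROM {}"
    src_count, src_sum = conn.execute(query.format(src)).fetchone()
    tgt_count, tgt_sum = conn.execute(query.format(tgt)).fetchone()
    return {
        "row_count_match": src_count == tgt_count,
        "amount_sum_match": abs(src_sum - tgt_sum) < 1e-9,
    }

print(reconcile(conn, "src_revenue", "tgt_revenue"))
```

Against a real Oracle warehouse the same checks would run through an Oracle driver and be extended with column-level and transformation-rule comparisons taken from the mapping document.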
Finance Data Quality Assurance Specialist
Posted today
Job Description
We are seeking a highly skilled Senior QA Tester with 8+ years of experience in finance-related projects to join our team.
This role requires functional expertise in Revenue Accounting and Oracle ERP Finance Modules, as well as hands-on experience in database and data warehouse testing, preferably in Oracle DB.
The ideal candidate will be able to validate data sources, extract data, apply transformation logic, load data into target tables, and verify data in reports and dashboards.
The selected individual will have experience analyzing ETL mapping documents and developing and executing SQL scripts based on them, and must be able to verify data correctness and integrity.
A good understanding of QA processes based on ETL mapping documents, such as preparing test strategies, test cases, and defect triage, is essential.
Key Responsibilities:
- Validate data sources, extract data, and apply transformation logic
- Load data into target tables and verify data in reports and dashboards
- Analyze ETL mapping documents and develop SQL scripts
- Verify data correctness and integrity
- Participate in QA processes such as test strategy, test cases, and defect triage
Requirements:
- 8+ years of experience in finance-related projects
- Functional expertise in Revenue Accounting and Oracle ERP Finance Modules
- Hands-on experience in database and data warehouse testing
- Experience in ETL testing
- Good understanding of QA processes
Global Sales Data Quality Specialist
Posted today
Job Description
Sales Operations Role
As a Sales Operations Analyst, you will be part of a team that drives continuous process and data quality improvement to ensure orders are processed correctly.
The success of this role is measured against increasing the velocity of deals and the quality of Customer Relationship Management (CRM) data used to drive key business decisions.
We are looking for someone with exceptional analytical skills who can work productively with stakeholders at all levels of the organization on a global scale.
This role entails:
- Supporting all aspects of day-to-day operations of the sales function globally
- Reviewing opportunities for accuracy and provisioning orders on successful closure of a deal
- Supporting the Sales Operations Manager to improve sales data quality and processes
- Reporting on data quality metrics to executive-level stakeholders
- Implementing surveys and defining metrics for executive-level stakeholders
- Supporting the sales team globally on CRM and other sales operations systems
- Delivering sales metrics, dashboards, and other ad-hoc analytical tasks
What we offer:
- Distributed work environment with twice-yearly team sprints in person
- Personal learning and development budget
- Annual compensation review
- Recognition rewards
- Annual holiday leave
- Maternity and paternity leave
- Employee Assistance Programme
- Opportunity to travel to new locations to meet colleagues
Key requirements include experience in a sales operations/sales order processing role, experience driving data quality improvement, and excellent problem-solving skills.