Showing 38 Data jobs in Abu Dhabi
Master Data Management Lead
Posted today
Job Description
Join us at Enquo, where we're dedicated to harnessing the transformative power of data and technology. As leaders in technology and data solutions, we prioritize humanity in everything we do. Our mission is clear: to empower organizations to unlock the full potential of their data through cutting-edge technology and exceptional services.
We envision a brighter future, where technology ignites extraordinary achievements and drives profound transformation. Here at Enquo, challenges are opportunities, and our passionate team thrives on making meaningful impacts on society. With humility and a collaborative spirit, we leverage teamwork and creative thinking to deliver optimal and trustworthy solutions.
As a purpose-driven company, we're passionate about using data and technology as catalysts for positive change. Our vision extends to a world where everyone can harness the power of data to reach their fullest potential. At Enquo, honesty, trust, and empathy form the foundation of our simple business language. We're agile, adaptable, and committed to bridging any business need with innovative data solutions.
Join our journey, where curiosity and entrepreneurship drive us to explore uncharted territories and create solutions that truly matter. We foster a collaborative and inclusive environment, valuing every team member's contributions. If you're a talented, curious, and creative individual who thrives in a fast-paced, dynamic setting, we invite you to be part of our mission. Together, let's create new opportunities through data and technology, shaping a more humane future for all.
Enquo: fueling a better future through innovation, data, and technology.
Role Description
The Master Data Management Lead will be responsible for defining, designing, and building dimensional databases to meet business needs. The role also assists in applying and implementing data standards and guidelines, coding structures, and data replication to ensure access to, and the integrity of, data sets.
Key Responsibilities
- Excellent experience in Master Data Management, including Meta-Data Management, Data Migration, Data Security, and Data Transformation/Conversion
- Experience in ETL processes and advanced SQL skills.
- Intermediate Requirements Gathering/Elicitation, Documentation, and Source to Target mapping skills.
- Working knowledge of Conceptual, Logical and Physical Data Modeling concepts as well as Database design concepts
- Practical experience working in an Agile Methodology
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- Proven experience in data quality management, with at least 8 years of experience in a leadership role.
- Strong understanding of data quality frameworks, tools, and methodologies.
- Proficiency in SQL and experience working with data profiling tools (a small illustrative sketch follows this list).
- Excellent analytical and problem-solving skills.
- Leadership and team management abilities.
- Effective communication and collaboration skills.
- Familiarity with data governance principles is a plus.
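By way of illustration only, and not part of the role requirements, the sketch below shows the kind of basic data-quality profiling this posting alludes to, run here against a throwaway SQLite table. The table and column names are hypothetical.

```python
# Illustrative only: a minimal data-profiling check of the kind a master data
# management team might automate. Table and column names are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customer_master (customer_id TEXT, email TEXT, country TEXT);
    INSERT INTO customer_master VALUES
        ('C001', 'a@example.com', 'AE'),
        ('C002', NULL,            'AE'),
        ('C002', 'b@example.com', NULL);   -- duplicate key and missing values
""")

# Null rate per column: a basic completeness metric.
for column in ("customer_id", "email", "country"):
    total, nulls = conn.execute(
        f"SELECT COUNT(*), SUM(CASE WHEN {column} IS NULL THEN 1 ELSE 0 END) "
        "FROM customer_master"
    ).fetchone()
    print(f"{column}: {nulls / total:.0%} null")

# Duplicate business keys: a basic uniqueness metric.
dupes = conn.execute(
    "SELECT customer_id, COUNT(*) FROM customer_master "
    "GROUP BY customer_id HAVING COUNT(*) > 1"
).fetchall()
print("duplicate keys:", dupes)
```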
Data science trainer - Big Data Fundamentals
Posted today
Job Description
We are looking for an experienced freelance trainer to conduct a 3-day workshop on Big Data Fundamentals for a professional audience in Abu Dhabi. The trainer should be able to deliver engaging, practical, and interactive sessions that combine theory with real-world applications.
Course Overview
This workshop aims to introduce participants to the core principles of big data, its tools and technologies, and how data-driven insights can enhance decision-making and business strategy.
Indicative Learning Areas:
- Understanding big data concepts and architecture
- Overview of data processing frameworks (e.g., Hadoop, Spark); see the short sketch after this outline
- Data storage, management, and analytics fundamentals
- Practical applications and case studies in business and technology
The final course content and detailed outline are expected to be provided by the trainer.
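For illustration only, and not a substitute for the trainer's own material, the snippet below shows the kind of minimal PySpark example such a workshop might walk through. It assumes a local pyspark installation and uses made-up data.

```python
# Illustrative only: the kind of minimal PySpark example a Big Data
# Fundamentals session might walk through (requires a local pyspark install).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("fundamentals-demo").getOrCreate()

# A tiny in-memory dataset standing in for a large distributed one.
orders = spark.createDataFrame(
    [("AE", 120.0), ("AE", 80.0), ("SA", 200.0)],
    ["country", "amount"],
)

# Aggregations are expressed declaratively and executed across the cluster.
orders.groupBy("country").agg(F.sum("amount").alias("total")).show()

spark.stop()
```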
Trainer Requirements
- Qualification in Data Science, Computer Science, IT, or a related field
- Minimum 5 years of professional or training experience in data analytics or big data technologies
- Strong communication and facilitation skills
- Must be available to deliver the training in person at Yas Island, Abu Dhabi
Interested trainers are invited to apply with the following:
- Updated CV or professional profile
- Relevant training experience
- Proposed hourly rate (inclusive of course content preparation, outline development, and transport costs)
Data Engineer
Posted today
Job Description
About the Role
We are an emerging, AI-native, product-driven, agile start-up under the Abu Dhabi government, and we are seeking a motivated and technically versatile Data Engineer to join our team. You will play a key role in delivering data platforms, pipelines, and ML enablement within a Databricks on Azure environment.
As part of a stream-aligned delivery team, you’ll work closely with Data Scientists, Architects, and Product Managers to build scalable, high-quality data solutions for clients. You'll be empowered by a collaborative environment that values continuous learning, Agile best practices, and technical excellence.
Ideal candidates have strong hands-on experience in Databricks, Python, ADF and are comfortable in fast-paced, client-facing consulting engagements.
Skills and Experience Requirements
1. Technical
- Databricks (or similar), e.g. notebooks (Python, SQL), Delta Lake, job scheduling, clusters and workspace management, Unity Catalog, and access-control awareness (a brief notebook-style sketch follows this list)
- Cloud data engineering – ideally Azure, including storage (e.g., ADLS, S3), compute, and secrets management
- Development languages such as Python, SQL, C#, JavaScript, etc., especially for data ingestion, cleaning, and transformation
- ETL / ELT – including structured logging, error handling, reprocessing strategies, APIs, flat files, databases, message queues, event streaming, event sourcing etc.
- Automated testing (ideally TDD), pairing/mobbing, Trunk-Based Development, Continuous Deployment, and Infrastructure-as-Code (Terraform)
- Git and CI/CD for notebooks, data pipelines, and deployments
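As a purely illustrative sketch of the notebook-style Delta Lake work described above, and not this team's actual codebase, the snippet below assumes a Databricks (or delta-spark) environment where a `spark` session already exists; paths and table names are hypothetical.

```python
# Illustrative only: a notebook-style Delta Lake round trip of the kind this
# role involves. Table names are hypothetical; assumes a Databricks
# (or delta-spark) environment where `spark` is already available.
from pyspark.sql import functions as F

raw = spark.createDataFrame(
    [("2024-01-01", "sensor-1", 21.5), ("2024-01-01", "sensor-2", None)],
    ["event_date", "device_id", "reading"],
)

# Basic cleaning before landing the data in a curated Delta table.
clean = raw.dropna(subset=["reading"]).withColumn("event_date", F.to_date("event_date"))

clean.write.format("delta").mode("append").saveAsTable("curated.sensor_readings")

# Downstream consumers read the same table transactionally.
spark.table("curated.sensor_readings").show()
```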
2. Integration & Data Handling
- Experienced in delivering platforms for clients – including file transfer, APIs (REST, etc.), SQL/NoSQL/graph databases, JSON, CSV, XML, Parquet, etc.
- Data validation and profiling – assess incoming data quality; cope with schema drift, deduplication, and reconciliation (see the short sketch after this list)
- Testing and monitoring pipelines: Unit tests for transformations, data checks, and pipeline observability
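The snippet below is a minimal, hedged illustration of the validation and profiling concerns listed above, using pandas on made-up data; the expected schema and column names are hypothetical.

```python
# Illustrative only: a minimal validation step of the kind described above,
# using pandas; column names and the expected schema are hypothetical.
import pandas as pd

EXPECTED_COLUMNS = {"order_id", "customer_id", "amount"}

incoming = pd.DataFrame(
    {"order_id": [1, 1, 2], "customer_id": ["A", "A", "B"],
     "amount": [10.0, 10.0, None], "promo_code": ["X", "X", None]}
)

# Schema drift: report columns that were added or dropped upstream.
drift = {
    "unexpected": sorted(set(incoming.columns) - EXPECTED_COLUMNS),
    "missing": sorted(EXPECTED_COLUMNS - set(incoming.columns)),
}

# Basic profiling and deduplication before the data enters the pipeline.
quality = {
    "row_count": len(incoming),
    "null_amounts": int(incoming["amount"].isna().sum()),
    "duplicate_rows": int(incoming.duplicated().sum()),
}
deduplicated = incoming.drop_duplicates()

print(drift, quality, len(deduplicated))
```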
3. Working Style
- Comfortable leveraging the best of lean, agile and waterfall approaches. Can contribute to planning, estimation, and documentation, but also collaborative daily re-prioritisation
- Able to explain technical decisions to teammates or clients
- Documents decisions and keeps stakeholders informed
- Comfortable seeking support from other teams for Product, Databricks, Data architecture
- Happy to collaborate with Data Science team on complex subsystems
Nice-to-haves
- MLflow or light MLOps experience (for the data science touchpoints)
- dbt / Dagster / Airflow or similar transformation and orchestration tools
- Understanding of security and compliance (esp. around client data)
- Past experience in consulting or client-facing roles
Candidate Requirements
- 5–8 years (minimum 3–4 years hands-on with cloud/data engineering, 1–2 years in Databricks/Azure, and team/project leadership exposure)
- Bachelor’s degree in Computer Science, Data Engineering, Software Engineering, or Information Systems
Job Type: Full-time
Benefits: Visa, insurance, yearly flight ticket, bonus scheme, and relocation logistics covered
The interview process consists of 2 or 3 technical/behavioral interviews.
Data Engineer
Posted today
Job Description
About the Role
We are seeking a motivated and technically versatile Data Engineer to join our team. You will play a key role in delivering data platforms, pipelines, and ML enablement within a Databricks on Azure environment.
As part of a stream-aligned delivery team, you’ll work closely with Data Scientists, Architects, and Product Managers to build scalable, high-quality data solutions for clients. You'll be empowered by a collaborative environment that values continuous learning, Agile best practices, and technical excellence.
Ideal candidates have strong hands-on experience in Databricks, Python, and ADF, and are comfortable in fast-paced, client-facing consulting engagements.
Skills and Experience Requirements
1. Technical
- Databricks (or similar), e.g. notebooks (Python, SQL), Delta Lake, job scheduling, clusters and workspace management, Unity Catalog, and access-control awareness
- Cloud data engineering – ideally Azure, including storage (e.g., ADLS, S3), compute, and secrets management
- Development languages such as Python, SQL, C#, JavaScript, etc., especially for data ingestion, cleaning, and transformation
- ETL / ELT – including structured logging, error handling, reprocessing strategies, APIs, flat files, databases, message queues, event streaming, event sourcing etc.
- Automated testing (ideally TDD), pairing/mobbing, Trunk-Based Development, Continuous Deployment, and Infrastructure-as-Code (Terraform)
- Git and CI/CD for notebooks, data pipelines, and deployments
2. Integration & Data Handling
- Experienced in delivering platforms for clients – including file transfer, APIs (REST, etc.), SQL/NoSQL/graph databases, JSON, CSV, XML, Parquet, etc.
- Data validation and profiling – assess incoming data quality; cope with schema drift, deduplication, and reconciliation
- Testing and monitoring pipelines: Unit tests for transformations, data checks, and pipeline observability
3. Working Style
- Comfortable leveraging the best of lean, agile and waterfall approaches. Can contribute to planning, estimation, and documentation, but also collaborative daily re-prioritisation
- Able to explain technical decisions to teammates or clients
- Documents decisions and keeps stakeholders informed
- Comfortable seeking support from other teams for Product, Databricks, Data architecture
- Happy to collaborate with Data Science team on complex subsystems
Nice-to-haves
- MLflow or light MLOps experience (for the data science touchpoints)
- dbt / Dagster / Airflow or similar transformation and orchestration tools
- Understanding of security and compliance (esp. around client data)
- Past experience in consulting or client-facing roles
Candidate Requirements
- 5–8 years (minimum 3–4 years hands-on with cloud/data engineering, 1–2 years in Databricks/Azure, and team/project leadership exposure)
- Bachelor’s degree in Computer Science, Data Engineering, Software Engineering, or Information Systems
Disclaimer:
This job posting is not open to recruitment agencies. Any candidate profile submitted by a recruitment agency will be considered as being received directly from an applicant. Contango reserves the right to contact the candidate directly, without incurring any obligations or liabilities for payment of any fees to the recruitment agency.
Senior Specialist - Data Science
Posted today
Job Description
Embark on a journey where your unique contributions are celebrated, and your professional growth is embraced. At ADCB, we nurture a diverse, inclusive community where every voice is valued.
About the business area
GBS is a group of highly skilled and talented professionals who form an essential part of ADCB's continued journey of success. With a proud history of commitment, innovation and delivery, GBS constantly strives for excellence whilst ensuring the highest standards of quality and risk awareness. Each and every member of the GBS family plays an integral role in driving ADCB's strategy, growth and digital evolution by working closely with our valued business partners to achieve exceptional customer experience through our outstanding service and support.
We are actively seeking an ambitious professional to join our team at ADCB to work alongside passionate colleagues who share your ambition to redefine excellence in UAE banking.
In this role, your key responsibilities include:
- Develop and Implement advanced data science solutions to support the organization’s AI strategy
- Lead the design and execution of analytical models, ensuring data integrity, scalability, and relevance to business objectives
- Collaborate with cross-functional teams to translate complex data into actionable insights, and contribute to the structuring of AI-related roles and capabilities through evidence-based workforce planning and job design
- Drive the design and deployment of advanced Artificial Intelligence (AI)/Machine Learning (ML) models to optimize core banking operations, enhance digital customer journeys, and support strategic decision-making
- Build and maintain robust machine learning pipelines using Azure AI Foundry or AWS SageMaker, ensuring scalability, automation, and seamless integration with cloud-native services
- Develop and fine-tune generative AI models, including RAG and agentic workflows, leveraging cutting-edge techniques like Low-Rank Adaptation (LoRA), Parameter-Efficient Fine-Tuning (PEFT), and multimodal architectures (a minimal PEFT/LoRA sketch follows this list)
- Design and execute complex experimental frameworks, including A/B testing, to derive actionable insights and validate model performance with statistical inference
- Work closely with software and data engineers to implement version-controlled, containerized (Docker) solutions, and Continuous Integration (CI)/Continuous Deployment (CD) pipelines for efficient model deployment
- Define and enforce documentation standards, coding practices, and reproducibility protocols to ensure consistency and quality across all data science projects
- Monitor and evaluate model performance using Machine Learning Operations (MLOps) and Large Language Model (LLM)-specific metrics (e.g., Bilingual Evaluation Understudy (BLEU), ROUGE), ensuring transparency, fairness, and continuous improvement of Artificial Intelligence (AI) systems
- Strong knowledge of banking operations and terminologies
- Analyse complex banking products and services to facilitate decision-making
- Adhere to all relevant organisational and departmental policies, processes, standard operating procedures and instructions so that work is carried out to the required standard and in a consistent manner while delivering the required standard of service to customers and stakeholders
- Manage self in line with the Bank’s people management policies, procedures, processes and practices to ensure adherence and to maximise own contribution to business performance
- Demonstrate Our Promise and apply the ADCB Service Standards to deliver the Bank’s required levels of service in all internal and external customer interactions
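Purely as an illustration of the parameter-efficient fine-tuning mentioned above, and not ADCB's actual stack, the sketch below wires a LoRA adapter onto a small open model with the Hugging Face PEFT library; the base model name and hyperparameters are placeholders.

```python
# Illustrative only: wiring up a LoRA adapter with the PEFT library, as one
# hedged example of the parameter-efficient fine-tuning mentioned above.
# The base model name and hyperparameters are placeholders, not recommendations.
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

base_model_name = "gpt2"  # placeholder small model for the sketch
model = AutoModelForCausalLM.from_pretrained(base_model_name)

lora_config = LoraConfig(
    r=8,                        # rank of the low-rank update matrices
    lora_alpha=16,              # scaling factor applied to the update
    lora_dropout=0.05,
    target_modules=["c_attn"],  # attention projection layers in GPT-2
    task_type="CAUSAL_LM",
)

# Only the small adapter matrices are trainable; the base weights stay frozen.
peft_model = get_peft_model(model, lora_config)
peft_model.print_trainable_parameters()
```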
The ideal candidate should have the following experience:
- Minimum 5 years of experience in data science, with proven expertise in AI/ML, statistical modelling, and cloud-based deployments
- Strong background in banking processes, generative AI, and Machine Learning Operations (MLOps)
- Experience leading cross-functional teams and delivering scalable, production-grade solutions in a regulated, data-driven environment
- Bachelor’s degree in a quantitative field such as Computer Science, Artificial Intelligence, Software Engineering, Statistics, Mathematics, or a related field required
- Master’s degree or equivalent in AI/ML is preferred
- Advanced Python for AI/ML development and data manipulation
- Strong statistical expertise and experimental design (A/B testing, inference); a brief z-test sketch follows this list
- Experience with ML models (transformers, RNNs), GenAI, and agentic workflows
- Skilled in fine-tuning LLMs using LoRA, PEFT, and evaluating with BLEU/ROUGE
- Proficient in building ML pipelines on Azure AI Foundry or AWS SageMaker
- Familiar with cloud services (Azure/AWS), serverless deployments, and MLOps tools
- Strong data visualization skills (Tableau, Power BI) with UX-focused design
- Effective collaboration with engineering teams and cross-functional stakeholders
- Knowledge of CI/CD, containerization, and documentation standards
- Active learner with awareness of emerging AI trends and technologies
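As a hedged illustration of the experimental-design skills listed above, the snippet below runs a two-proportion z-test on made-up A/B conversion counts using statsmodels.

```python
# Illustrative only: a two-proportion z-test of the sort used to read out a
# simple A/B experiment; the conversion counts here are made up.
from statsmodels.stats.proportion import proportions_ztest

conversions = [412, 468]   # variant A, variant B
exposures = [10_000, 10_000]

z_stat, p_value = proportions_ztest(count=conversions, nobs=exposures)
print(f"z = {z_stat:.2f}, p = {p_value:.4f}")
# A small p-value suggests the difference in conversion rates is unlikely to
# be due to chance alone; the decision threshold remains a business choice.
```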
What we offer:
Comprehensive Benefits Package: This includes market-leading medical insurance, group life and personal accident insurance, paid leave and leave airfare, employee preferential rates on loans and finance facilities, staff discounts and offers, and children education assistance (for certain job levels).
Flexible and Remote Working Options: We understand the importance of work-life balance and offer flexible working arrangements, subject to eligibility and job requirements.
Learning and Development Opportunities: We value and facilitate continuous learning and personal development, through a variety of exciting learning opportunities, such as structured instructor-led courses, a comprehensive e-Learning catalog, on-the-job training and professional development programs.
Senior Data Engineer
Posted today
Job Description
Job Title: Senior Data Engineer
Location: Abu Dhabi, UAE (Onsite)
Employment Type: Full-Time, Permanent
We are seeking a highly skilled Senior Data Engineer with 7–10+ years of experience to join our team in Abu Dhabi. The ideal candidate will have deep expertise in building and optimizing scalable data pipelines, working with large datasets, and leveraging modern cloud-based technologies. This is an onsite role offering the opportunity to work on cutting-edge data engineering initiatives, including data preparation for advanced analytics, ML, and GenAI use cases.
Key Responsibilities
Design, build, and maintain scalable, reliable, and high-performance data pipelines.
Develop and optimize ETL/ELT processes to handle large-scale structured and unstructured data.
Implement data models and warehouse solutions with a focus on Snowflake.
Integrate and process data from diverse sources including SQL and NoSQL databases.
Ensure data quality, governance, and security standards are maintained across all systems.
Collaborate with data scientists and ML engineers to prepare data for advanced analytics and GenAI models.
Work closely with business stakeholders to translate requirements into data engineering solutions.
Monitor, troubleshoot, and optimize data workflows for performance and cost efficiency.
Requirements
Experience: 7–10+ years of professional experience in data engineering.
Programming: Proficiency in Python (mandatory).
Data Engineering: Strong SQL expertise; experience with NoSQL databases; advanced skills in data modeling, ETL/ELT processes, and orchestration tools (e.g., Airflow, cloud-native services); a brief orchestration sketch follows this list.
Cloud: Hands-on experience with cloud data warehousing, data lakes, and related services (mandatory).
Snowflake: Proven experience designing and managing data pipelines/warehousing in Snowflake.
Big Data: Experience handling large volumes of structured and unstructured data.
Desired Skills: Exposure to preparing data for ML/GenAI workloads; familiarity with data governance and data quality frameworks.
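As an illustrative sketch only, the snippet below shows how an Airflow DAG might orchestrate a daily Snowflake COPY INTO of the kind this role describes; connection details, stage, and table names are hypothetical, and it assumes Airflow 2.4+ with the snowflake-connector-python package.

```python
# Illustrative only: a minimal Airflow DAG that loads staged files into
# Snowflake. Credentials, stage, and table names are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator
import snowflake.connector


def copy_into_snowflake():
    # In practice credentials come from a secrets backend, not literals.
    conn = snowflake.connector.connect(
        account="my_account", user="etl_user", password="***",
        warehouse="LOAD_WH", database="ANALYTICS", schema="RAW",
    )
    try:
        conn.cursor().execute(
            "COPY INTO raw.orders FROM @raw.orders_stage FILE_FORMAT = (TYPE = PARQUET)"
        )
    finally:
        conn.close()


with DAG(
    dag_id="orders_daily_load",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="copy_into_snowflake", python_callable=copy_into_snowflake)
```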
What We Offer
Opportunity to work onsite in Abu Dhabi on high-impact data engineering projects.
Collaborative work environment with cross-functional teams in cloud, data, and ML domains.
Chance to contribute to the modernization of enterprise data platforms with cutting-edge technologies.
Data Engineer Lead
Posted today
Job Viewed
Job Description
Employment Type: Full time
Timing: General
Work Mode: Work from Office/Hybrid
Experience: 8+ Years
Location: Abu Dhabi
What Are We?
We are an exclusive cloud services company, transforming the world with digital technology. We excel in serving customers with experienced, certified professional cloud talent. We are passionate about creating world-class human capital with hands-on cloud experience to build, migrate, and support complex cloud environments. We provide tailored services and solutions for each customer and measure our success based on the business outcomes we deliver.
Who Are We?
We are a group of passionate software professionals who love to work together and grow together. We create and inspire each other every day. We are a purpose-driven collaborative team, dedicated to achieving mutual success. Our corporate culture is unique, and each employee is valued. We are committed to going the extra mile to satisfy our customers. Leveraging our cutting-edge expertise, we provide the best solutions to clients in the areas of Cloud, Data, and IoT.
Role and Responsibilities:
- Talk to client stakeholders and understand the requirements for building their data warehouse / data lake / data lakehouse.
- Design, develop, and maintain data pipelines in Azure Data Factory (ADF) for ETL from on-premise and cloud-based sources.
- Design, develop, and maintain data warehouses and data lakes in Azure.
- Run large data platform and related programs to provide business intelligence support.
- Design and develop data models to support business intelligence solutions.
- Implement best practices in data modeling and data warehousing.
- Troubleshoot and resolve issues related to ETL and data connections.
Skills Required:
- Excellent written and verbal communication skills.
- Excellent knowledge and experience in ADF.
- Well-versed with ADLS Gen 2 (a short read sketch follows this list).
- Knowledge of SQL for data extraction and transformation.
- Ability to work with various data sources (Excel, SQL databases, APIs, etc.).
- Knowledge in SAS would be an added advantage.
- Knowledge in Power BI would be an added advantage.
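For illustration only, the sketch below reads a file from ADLS Gen 2 with the Azure SDK and loads it into pandas for transformation; the storage account, container, and path are hypothetical, and authentication is assumed to go through Azure AD.

```python
# Illustrative only: reading a file from ADLS Gen 2 with the Azure SDK, as one
# hedged example of the source systems this role works with. The account,
# container, and path are hypothetical.
import io

import pandas as pd
from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

service = DataLakeServiceClient(
    account_url="https://mydatalake.dfs.core.windows.net",
    credential=DefaultAzureCredential(),
)
file_client = service.get_file_system_client("landing").get_file_client(
    "sales/2024/01/orders.csv"
)

# Download the raw bytes and load them into a DataFrame for transformation.
raw_bytes = file_client.download_file().readall()
orders = pd.read_csv(io.BytesIO(raw_bytes))
print(orders.head())
```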
Quantitative Researcher & Developer - Data Science
Posted today
Job Description
Overview
We want to crack the hardest challenges in systematic investing.
ADIA’s Quantitative Research & Development team works in a multidisciplinary environment, where experts with deep specializations pioneer new ways to think about investing, and put these ideas into practice.
Our team leverages data to enable the research, development and implementation of quantitative investment strategies by providing comprehensive, clean, and actionable data as well as building robust, efficient, and scalable technology.
Key Responsibilities
- Data Ingestion & Cleansing: Develop pipelines to clean, tag, and integrate new data sources.
- Data Analysis: Generate descriptive statistics, uncover valuable patterns, and present potential applications of datasets (a toy sketch follows this list).
- Data Infrastructure: Maintain automated systems for data collection, cleaning, and retrieval.
Requirements
- Education: Ph.D. or Master's in a quantitative discipline, or equivalent experience.
- Experience: 3+ years in data science
- Technical Skills: Strong proficiency in SQL and Python, as well as proficiency in at least one other language. Cloud experience and familiarity with financial data sets are a plus.
- Analytical Skills: Experience with applying analytical and statistical techniques to large real-world datasets.
- Problem Solving & Communication: Ability to solve complex problems and communicate findings to technical and non-technical stakeholders.
- Passion for Data: Curiosity and desire to learn new methods and technologies.
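As a toy, hedged illustration of the ingest-clean-describe loop mentioned in the responsibilities above, the snippet below uses pandas on a made-up price series; column names and values are invented.

```python
# Illustrative only: a toy version of the ingest-clean-describe loop mentioned
# above, using pandas; the dataset and column names are made up.
import pandas as pd

# Stand-in for a newly onboarded vendor dataset.
prices = pd.DataFrame(
    {
        "date": ["2024-01-02", "2024-01-03", "2024-01-03", "2024-01-04"],
        "ticker": ["ABC", "ABC", "ABC", "ABC"],
        "close": [101.2, None, 102.8, 103.1],
    }
)

clean = (
    prices
    .assign(date=pd.to_datetime(prices["date"]))
    .drop_duplicates(subset=["date", "ticker"], keep="last")  # dedupe on key
    .dropna(subset=["close"])
    .sort_values("date")
)

# Descriptive statistics and simple derived series for an initial assessment.
print(clean["close"].describe())
print(clean["close"].pct_change().dropna())
```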