239 Data Engineer jobs in the United Arab Emirates

Data Engineer

Dubai, Dubai Teliolabs Communication Private Limited

Posted today


Job Description

Location: Dubai

Who Can Apply: Candidates who are currently in Dubai
Job Type: Contract
Experience: 8+ years

Job Summary:

We are looking for an experienced Data Engineer to design, develop, and optimize data pipelines, ETL processes, and data integration solutions. The ideal candidate should have expertise in AWS cloud services, data engineering best practices, open-source tools, and data schema design. The role requires hands-on experience with large-scale data processing, real-time data streaming, and cloud-based data architectures.

Key Responsibilities:

  • Develop and Maintain Data Pipelines to process structured and unstructured data efficiently.
  • Implement ETL/ELT Workflows for batch and real-time data processing.
  • Optimize Data Processing Workflows using distributed computing frameworks.
  • Ensure Data Integrity and Quality through data validation, cleaning, and transformation techniques.
  • Work with AWS Cloud Services, including S3, Redshift, Glue, Lambda, DynamoDB, and Kinesis.
  • Leverage Open-Source Tools like Apache Spark, Airflow, Kafka, and Flink for data processing.
  • Manage and Optimize Database Performance for both SQL and NoSQL environments.
  • Collaborate with Data Scientists and Analysts to enable AI/ML model deployment and data accessibility.
  • Support Data Migration Initiatives from on-premise to cloud-based data platforms.
  • Ensure Compliance and Security Standards in handling sensitive and regulated data.
  • Develop Data Models and Schemas for efficient storage and retrieval.
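As a rough illustration of the pipeline work these responsibilities describe, the sketch below shows a minimal extract-validate-transform step in plain Python. All field and function names here are hypothetical, and a production version would typically run on Spark, Glue, or a similar engine rather than in-process lists:

```python
# Minimal batch ETL sketch: validate and normalise incoming records,
# routing bad rows to a reject queue instead of dropping them silently.
# Field names ("id", "amount", "currency") are illustrative only.

def transform(records):
    """Keep valid rows, normalise fields, and collect the rest for review."""
    clean, rejected = [], []
    for rec in records:
        # Validation: every row needs a non-empty id and a numeric amount.
        if not rec.get("id") or not isinstance(rec.get("amount"), (int, float)):
            rejected.append(rec)
            continue
        clean.append({
            "id": str(rec["id"]).strip(),
            "amount": round(float(rec["amount"]), 2),
            "currency": rec.get("currency", "AED").upper(),
        })
    return clean, rejected

raw = [
    {"id": " 42 ", "amount": 19.999, "currency": "aed"},
    {"id": "", "amount": 5.0},         # rejected: empty id
    {"id": "43", "amount": "oops"},    # rejected: non-numeric amount
]
clean, rejected = transform(raw)
print(clean)
print(len(rejected))
```

The same keep-or-reject pattern scales to PySpark DataFrames or Glue jobs; the point is that validation, cleaning, and transformation (bullet four above) happen in one auditable step.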

Required Skills & Qualifications:

  • 8+ years of experience in data engineering, data architecture, and cloud computing.
  • Strong knowledge of AWS Services such as Glue, Redshift, Athena, Lambda, and S3.
  • Expertise in ETL Tools, including Talend, Apache NiFi, Informatica, dbt, and AWS Glue.
  • Proficiency in Open-Source Tools such as Apache Spark, Hadoop, Airflow, Kafka, and Flink.
  • Strong Programming Skills in Python, SQL, and Scala.
  • Experience in Data Schema Design, normalization, and performance optimization.
  • Knowledge of Real-time Data Streaming using Kafka, Kinesis, or Apache Flink.
  • Experience in Data Warehouse and Data Lake Solutions.
  • Hands-on experience with DevOps and CI/CD Pipelines for data engineering workflows.
  • Understanding of AI and Machine Learning Data Pipelines.
  • Strong analytical and problem-solving skills.

Preferred Qualifications:

  • AWS Certified Data Analytics – Specialty or AWS Solutions Architect certification.
  • Experience with Kubernetes, Docker, and serverless data processing.
  • Exposure to MLOps and data engineering practices for AI/ML solutions.
  • Experience with distributed computing and big data frameworks.

Data Engineer

Dubai, Dubai Everythinginclick

Posted today


Job Description

The Data Engineer will be responsible for developing semantic models on top of the Data Lake/Data Warehouse to fulfill the self-service BI foundation requirements. This includes data extraction from various data sources and integration into the central data lake/data warehouse using enterprise platforms like Informatica iPaaS.

Key Responsibilities of Data Engineer
  1. Designing data warehouse data models based on business requirements.
  2. Designing, developing, and testing both batch and real-time Extract, Transform and Load (ETL) processes required for data integration.
  3. Ingesting both structured and unstructured data into the SMBU data lake/data warehouse system.
  4. Designing and developing semantic models/self-service cubes.
  5. Performing BI administration and access management to ensure access and reports are properly governed.
  6. Performing unit testing and data validation to ensure business UAT is successful.
  7. Performing ad-hoc data analysis and presenting results in a clear manner.
  8. Assessing data quality of the source systems and proposing enhancements to achieve a satisfactory level of data accuracy.
  9. Optimizing ETL processes to ensure execution time meets requirements.
  10. Maintaining and architecting ETL pipelines to ensure data is loaded on time on a regular basis.
Qualification Required for Data Engineer
  1. 5 to 8 years of overall experience.
  2. Proven experience in the development of dimensional models in Azure Synapse with strong SQL knowledge.
  3. Minimum of 3 years working as a Data Engineer in the Azure ecosystem, specifically using Synapse, ADF & Databricks.
  4. Preferably 3 years of experience with data warehousing, ETL development, SQL Queries, Synapse, ADF, PySpark, and Informatica iPaaS for data ingestion & data modeling.
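For context on the dimensional-modelling requirement above, a star schema assigns surrogate keys to dimension rows and references them from fact rows. The toy sketch below shows that mechanic in plain, tool-agnostic Python (the `Dimension` class and all names are hypothetical; in practice this is done with Synapse/ADF/Informatica tooling):

```python
# Toy star-schema load: assign surrogate keys to a customer dimension,
# then store only the surrogate key in the fact table.

class Dimension:
    def __init__(self):
        self.by_business_key = {}  # business key -> surrogate key
        self.rows = []             # the dimension table itself

    def surrogate_key(self, business_key, attrs):
        """Return the existing surrogate key, or insert a new dimension row."""
        if business_key not in self.by_business_key:
            sk = len(self.rows) + 1
            self.by_business_key[business_key] = sk
            self.rows.append({"sk": sk, "business_key": business_key, **attrs})
        return self.by_business_key[business_key]

dim_customer = Dimension()
facts = []
for sale in [
    {"customer": "C-1", "name": "Alia", "amount": 100},
    {"customer": "C-2", "name": "Omar", "amount": 250},
    {"customer": "C-1", "name": "Alia", "amount": 75},
]:
    sk = dim_customer.surrogate_key(sale["customer"], {"name": sale["name"]})
    facts.append({"customer_sk": sk, "amount": sale["amount"]})

print(len(dim_customer.rows))   # 2 dimension rows for 3 fact rows
```

Repeat sales reuse the existing dimension row, which is exactly what keeps fact tables narrow and dimension lookups cheap.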

Data Engineer

Dubai, Dubai myZoi

Posted today


Job Description

Dubai, United Arab Emirates | Posted on 07/29/2025

myZoi is changing lives for the better for those who deserve it the most. We are an exciting fintech start-up aiming to promote financial inclusion globally. Our vision is to provide a level playing field to the unbanked and the underbanked in accessing essential financial services in an affordable, convenient, and transparent fashion. We are looking for smart, ambitious, and purpose-driven individuals to join us in this journey. Please apply via the link below if you are interested.

The Role

You will be working in our Data Platform team, providing data capability for internal and product requirements for myZoi. You will be proactive and innovative, and you will be using 100% cloud technologies based on AWS and modern open-source tooling to provide a real-time data infrastructure, allowing our teams to gain unprecedented insight into our wealth of application data. You will work with a world-class team of Data Analysts and Engineers to provide best-in-class solutions.

Responsibilities

Architect AWS-Centric Data Solutions:

  • Design and optimize high-performance data pipelines leveraging AWS native tools.
  • Architect a modular, AI-ready data lake with a roadmap to ensure secure ingestion, transformation, and consumption workflows.
  • Implement scalable streaming solutions that factor in performance, scalability, and cost.

Embed Security & Compliance Across AWS Workloads:

  • Build and enforce data governance protocols aligned with relevant regulatory and compliance requirements using AWS tools.
  • Collaborate with cybersecurity teams to implement IAM best practices, encryption strategies, and secure networking.
  • Maintain traceability and auditability for all data flows across the AWS stack.

Optimize for Observability & Cost Efficiency:

  • Work with our Cloud Architect and SRE to deploy and fine-tune monitoring dashboards using Datadog and AWS CloudWatch for performance, anomaly detection, and security event correlation.
  • Continuously evaluate storage and compute cost optimization across S3, EC2, Redshift, and Glue workloads.

Lead Through Influence and Collaboration:

  • Partner with Data Science, Cloud Architect, Security, and Engineering leads to align cloud architecture with evolving business goals and priorities to ensure future-readiness.
  • Mentor junior engineers in AWS best practices, scalable design, and secure coding standards.
  • Lead innovation across key Product initiatives.

Innovate with Purpose:

  • Evaluate and integrate AWS-compatible orchestration tools like Airflow, Lake Formation, ECS, EKS, or Managed Workflows.
  • Contribute to middleware and third-party orchestration strategies through secure APIs and event-driven patterns.
  • Design data products based on requirements that focus on key use cases, such as those related to social impact.
Required Qualifications
  • Bachelor's or Master's degree in Computer Science, Data Engineering, Software Engineering, or a related field.
  • 8–10 years of professional experience in data engineering, including 5+ years architecting on AWS underpinned by data governance. Mastery of AWS cloud services (S3, Lambda, Glue, Redshift, Kinesis, Lake Formation, Crawler, etc.).
  • Deep expertise in building scalable cloud-native solutions and managing secure data infrastructure, ensuring data governance.
  • Strong command of compliance-driven architecture design and real-time monitoring strategies.
  • Good understanding of compliance frameworks related to data privacy and information security.
  • Excellent communication skills, proven leadership in mentoring, and the ability to lead cross-functional initiatives.
Technical Requirements
  • Proficiency with agile tools (Jira).
  • Cloud Infrastructure & AWS Services: S3, Glue, Lambda, Redshift, Kinesis, IAM, CloudWatch, Lake Formation, etc. Strong awareness of AWS security tools.
  • Data Orchestration: Experience with Apache Airflow on ECS or AWS Managed Workflows. Familiarity with Step Functions and event-driven orchestration patterns.
  • Streaming & ETL Pipelines: Expertise in Kinesis Data Streams and Kafka (AWS-hosted or compatible). Proficiency in designing and optimizing ETL workflows using AWS.
  • Monitoring & Observability: Awareness of or exposure to logs, alerting, monitoring, detection, and tuning.
  • Security & Governance: Awareness of or exposure to AWS KMS, in addition to building governance workflows with AWS Config and Lake Formation.
  • Data Modeling & Optimization: Extensive experience in the design of AI-ready data lakes with scalable ingestion and query performance.
  • Programming Languages: Advanced coding in Python and SQL. Experience in Java and ETL processes is also preferred.
About You
  • You have strong communication skills and curiosity, and you are a quick learner.
  • You enjoy a creative, fast-paced, agile world.
  • You enjoy mentoring and teaching other developers to create a world-class, cohesive team.
  • You enjoy making friends and having fun.

At myZoi we strive to create both a product and a team that embraces equality, inclusion, diversity, and freedom. We want people who can be themselves and bring their own brand of value to the team. Come and join us!


Data Engineer

Dubai, Dubai WEbook, Inc.

Posted today


Job Description

Do you want to love what you do at work? Do you want to make a difference, an impact, and transform people's lives? Do you want to work with a team that believes in disrupting the normal, boring, and average?

If yes, then this is the job for you. webook.com is Saudi’s #1 event ticketing and experience booking platform in terms of technology, features, agility, and revenue, serving some of the largest mega events in the Kingdom with over 2 billion in sales. webook.com is part of the Supertech Group, which also includes UXBERT Labs, one of the best digital and user experience design agencies in the GCC, along with Kafu Games, the largest esports tournament platform in MENA.

Key Responsibilities:

  1. Data Integration and ETL Development: Architect and implement robust data integration pipelines to extract, transform, and load data from various sources (e.g., databases, SaaS applications, APIs, and flat files) into a centralized data platform. Design and develop complex ETL processes to ensure data quality, consistency, and reliability. Optimize data transformation workflows for performance and scalability.
  2. Data Infrastructure and Platform Management: Implement and maintain data ingestion, processing, and storage solutions to support data and analytics needs. Ensure data infrastructure's reliability, security, and availability through monitoring, troubleshooting, and disaster recovery planning.
  3. Data Governance and Metadata Management: Collaborate with the data governance team to establish policies, standards, and procedures. Develop and maintain metadata management systems for data lineage, provenance, and traceability. Implement data quality control measures and validation processes to ensure data integrity.

Minimum Requirements:

  • 5-6 years of experience as a Data Engineer or in a related data-driven role.
  • Proficient in designing and implementing data pipelines using tools like Apache Airflow, Airbyte, or cloud-based services.
  • Strong experience with data infrastructure such as data lakes, data warehouses, and real-time streaming platforms (e.g., Elastic, Google BigQuery, MongoDB).
  • Expertise in data modeling, data quality, and metadata management.
  • Proficient in programming languages like Python or Java, and SQL.
  • Familiar with cloud platforms (AWS, Google Cloud) and DevOps practices.
  • Excellent problem-solving skills and ability to work collaboratively across teams.
  • Strong communication skills to translate technical concepts to stakeholders.

Preferred Qualifications:

  • Experience with data visualization and BI tools (e.g., Tableau, Qlik).
  • Knowledge of machine learning and AI applications in data initiatives.
  • Project management experience and leadership in data projects.

Data Engineer

Abu Dhabi, Abu Dhabi Contango

Posted today


Job Description

Tasks

About the Role

We are an emerging AI-native, product-driven, agile start-up under the Abu Dhabi government, and we are seeking a motivated and technically versatile Data Engineer to join our team. You will play a key role in delivering data platforms, pipelines, and ML enablement within a Databricks on Azure environment.

As part of a stream-aligned delivery team, you’ll work closely with Data Scientists, Architects, and Product Managers to build scalable, high-quality data solutions for clients. You'll be empowered by a collaborative environment that values continuous learning, Agile best practices, and technical excellence.

Ideal candidates have strong hands-on experience in Databricks, Python, ADF and are comfortable in fast-paced, client-facing consulting engagements.

Skills and Experience requirements
1. Technical

  • Databricks (or similar), e.g. notebooks (Python, SQL), Delta Lake, job scheduling, cluster and workspace management, Unity Catalog, and access-control awareness
  • Cloud data engineering – ideally Azure, including storage (e.g., ADLS, S3), compute, and secrets management
  • Development languages such as Python, SQL, C#, JavaScript, etc., especially for data ingestion, cleaning, and transformation
  • ETL / ELT – including structured logging, error handling, reprocessing strategies, APIs, flat files, databases, message queues, event streaming, event sourcing etc.
  • Automated testing (ideally TDD), pairing/mobbing, trunk-based development, continuous deployment, and Infrastructure-as-Code (Terraform)
  • Git and CI/CD for notebooks, data pipelines, and deployments

2. Integration & Data Handling

  • Experienced in delivering platforms for clients – including file transfer, APIs (REST, etc.), SQL/NoSQL/graph databases, JSON, CSV, XML, Parquet, etc.
  • Data validation and profiling – assess incoming data quality; cope with schema drift, deduplication, and reconciliation
  • Testing and monitoring pipelines: Unit tests for transformations, data checks, and pipeline observability
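The schema-drift and deduplication points above can be sketched roughly as follows. This is a plain-Python illustration with hypothetical field names; in this role the same logic would live in Databricks/ADF pipelines:

```python
# Sketch: tolerate schema drift by projecting records onto an expected
# schema, then deduplicate on a business key, keeping the latest version.

EXPECTED = {"id": None, "email": None, "updated_at": 0}

def normalise(rec):
    """Project a record onto the expected schema, dropping unknown fields
    and filling missing ones with defaults (simple schema-drift handling)."""
    return {k: rec.get(k, default) for k, default in EXPECTED.items()}

def deduplicate(records):
    """Keep one record per id: the one with the greatest updated_at."""
    latest = {}
    for rec in map(normalise, records):
        key = rec["id"]
        if key not in latest or rec["updated_at"] > latest[key]["updated_at"]:
            latest[key] = rec
    return list(latest.values())

incoming = [
    {"id": "u1", "email": "a@x.io", "updated_at": 1},
    {"id": "u1", "email": "b@x.io", "updated_at": 2, "nickname": "drift"},
    {"id": "u2", "updated_at": 1},   # missing email -> filled with default
]
rows = deduplicate(incoming)
print(len(rows))
```

Unit tests for exactly these two functions are the kind of "unit tests for transformations" the testing bullet asks about.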

3. Working Style

  • Comfortable leveraging the best of lean, agile and waterfall approaches. Can contribute to planning, estimation, and documentation, but also collaborative daily re-prioritisation
  • Able to explain technical decisions to teammates or clients
  • Documents decisions and keeps stakeholders informed
  • Comfortable seeking support from other teams for Product, Databricks, Data architecture
  • Happy to collaborate with Data Science team on complex subsystems
Requirements

Nice-to-haves

  • MLflow or light MLOps experience (for the data science touchpoints)
  • dbt / Dagster / Airflow or similar transformation and orchestration tools
  • Understanding of security and compliance (esp. around client data)
  • Past experience in consulting or client-facing roles

Candidate Requirements

  • 5–8 years (minimum 3–4 years hands-on with cloud/data engineering, 1–2 years in Databricks/Azure, and team/project leadership exposure)
  • Bachelor’s degree in Computer Science, Data Engineering, Software Engineering, or Information Systems

Job Type: Full-time

Benefits

Visa, Insurance, Yearly Flight Ticket, Bonus scheme, relocation logistics covered

The interviewing process consists of two or three technical/behavioral interviews.


Data Engineer

Abu Dhabi, Abu Dhabi DeepLight

Posted today


Job Description


About DeepLight:

DeepLight is a pioneering AI company committed to pushing the boundaries of innovation in artificial intelligence. Our mission is to harness the power of data and machine learning to revolutionize industries and create a brighter future. With a dynamic team of experts and a culture of relentless innovation, we are at the forefront of AI research and development.

Position Overview:

DeepLight is seeking an exceptional Data Engineer to join our team of AI specialists in the UAE. As an Expert Data Engineer, you will be responsible for designing, implementing, and optimizing data pipelines and infrastructure to support our cutting-edge AI systems. You will collaborate closely with our multidisciplinary team to ensure the efficient collection, storage, processing, and analysis of large-scale data, enabling us to unlock valuable insights and drive innovation across various domains.

Requirements

  • Pipeline Development: Design, develop, and maintain scalable and reliable data pipelines to ingest, transform, and load diverse datasets from various sources, including structured and unstructured data, streaming data, and real-time feeds, with consideration for downstream AI/ML workloads
  • Data Integration: Implement robust data integration processes to seamlessly integrate data from different sources, ensuring consistency, reliability, and data quality for analytics and AI use cases
  • Data Storage: Design and optimize data storage solutions, including relational databases, NoSQL databases, data lakes, and cloud storage services, to efficiently store and manage large volumes of data for AI and machine learning model consumption
  • Performance Optimization: Optimize data processing and query performance to enhance system scalability, reliability, and efficiency, leveraging techniques such as indexing, partitioning, caching, and parallel processing—especially for AI model training and inference pipelines
  • Data Governance: Implement data governance frameworks to ensure data security, privacy, integrity, and compliance with regulatory requirements, including data encryption, access controls, and auditing—crucial for responsible AI deployment
  • Monitoring and Maintenance: Monitor data pipelines and infrastructure components, proactively identify and address issues, and perform routine maintenance tasks to ensure system stability and reliability across AI and data science environments
  • Collaboration: Collaborate closely with cross-functional teams, including data scientists, ML engineers, architects, and domain experts, to understand AI/ML requirements, gather insights, and deliver integrated, production-ready data solutions
  • Documentation: Create comprehensive documentation, including technical specifications, data flow diagrams, and operational procedures, to facilitate understanding, collaboration, and knowledge sharing across AI and analytics teams
  • Proven experience as a Data Engineer, with a track record of designing and implementing complex data pipelines and infrastructure solutions that support advanced analytics and AI initiatives
  • Expertise in data modeling, ETL (Extract, Transform, Load) processes, and data warehousing concepts, with proficiency in SQL and scripting languages (e.g., Python, Scala)—especially in AI pipeline preparation and feature engineering
  • Strong hands-on experience with big data technologies and frameworks, such as Hadoop, Spark, Kafka, and Flink, as well as cloud platforms (e.g., AWS, Azure, GCP) commonly used for AI workloads
  • Familiarity with containerization and orchestration technologies, such as Docker and Kubernetes, and DevOps practices for CI/CD (Continuous Integration/Continuous Deployment), including ML model deployment and data pipeline automation
  • Experience supporting AI/ML initiatives, including the preparation of training datasets, model input/output data flows, MLOps integration, and experimentation tracking
  • Excellent analytical, problem-solving, and communication skills, with the ability to translate complex technical concepts into clear and actionable insights
  • Proven ability to work effectively in a fast-paced, collaborative environment, with a passion for innovation, continuous learning, and contributing to AI-driven solutions

Benefits

Why Join DeepLight?
  • Impact: Be part of a dynamic team that is shaping the future of AI and making a meaningful impact on industries and society
  • Innovation: Work on cutting-edge projects at the intersection of AI, data engineering, and machine learning, leveraging the latest technologies and methodologies
  • Collaboration: Collaborate with a diverse team of experts from various disciplines, fostering creativity, learning, and growth
  • Opportunity: Enjoy ample opportunities for professional development, career advancement, and leadership roles in a rapidly growing company
  • Culture: Join a culture of curiosity, excellence, and collaboration, where your ideas are valued, and your contributions are recognized and rewarded

If you are passionate about data engineering, AI, and innovation, and you thrive in a dynamic and collaborative environment, we want to hear from you. Apply now to join DeepLight and be part of our journey to unlock the potential of AI for a brighter future.

Seniority level: Mid-Senior level
Employment type: Full-time
Job function: Information Technology
Industries: IT Services and IT Consulting


Data Engineer

Dubai, Dubai VAM Systems

Posted today


Job Description

We are currently looking for a Data Engineer for our UAE operations with the following skill set and terms & conditions.

Technical Skill Sets:

  • Data modelling, with good know-how of techniques such as star schema, snowflake schema, and dimensional modelling
  • Expert in database stored procedures and Structured Query Language (SQL), PL/SQL, and Java
  • ETL tools: Informatica, Microsoft SSIS
  • Business intelligence tools: SAP BusinessObjects, Microsoft Power BI
  • Core banking systems (preferred): Finacle, HPS PowerCARD

Soft Skills

  • Strong analytical and problem-solving abilities.
  • Excellent communication skills in English, both written and verbal.
  • Ability to work collaboratively in a team environment.

Preferred Qualifications

  • Understanding of the UAE Banking Regulatory Framework Submissions (BRF).
  • Familiarity with the CBUAE's SupTech initiative and its objectives.
  • Experience in automating regulatory reporting processes.

Joining time frame: 2 weeks (maximum 1 month)


Remote Work: No
Employment Type: Full-time


Data Engineer

Dubai, Dubai GSSTech Group

Posted today


Job Description

Job Title: Data Engineer (PySpark)

___

About the Role

We are seeking a highly skilled Data Engineer with deep expertise in PySpark and the Cloudera Data Platform (CDP) to join our data engineering team. As a Data Engineer, you will be responsible for designing, developing, and maintaining scalable data pipelines that ensure high data quality and availability across the organization. This role requires a strong background in big data ecosystems, cloud-native tools, and advanced data processing techniques.

The ideal candidate has hands-on experience with data ingestion, transformation, and optimization on the Cloudera Data Platform, along with a proven track record of implementing data engineering best practices. You will work closely with other data engineers to build solutions that drive impactful business insights.

Responsibilities

  1. Data Pipeline Development: Design, develop, and maintain highly scalable and optimized ETL pipelines using PySpark on the Cloudera Data Platform, ensuring data integrity and accuracy.
  2. Data Ingestion: Implement and manage data ingestion processes from a variety of sources (e.g., relational databases, APIs, file systems) to the data lake or data warehouse on CDP.
  3. Data Transformation and Processing: Use PySpark to process, cleanse, and transform large datasets into meaningful formats that support analytical needs and business requirements.
  4. Performance Optimization: Conduct performance tuning of PySpark code and Cloudera components, optimizing resource utilization and reducing runtime of ETL processes.
  5. Data Quality and Validation: Implement data quality checks, monitoring, and validation routines to ensure data accuracy and reliability throughout the pipeline.
  6. Automation and Orchestration: Automate data workflows using tools like Apache Oozie, Airflow, or similar orchestration tools within the Cloudera ecosystem.
  7. Monitoring and Maintenance: Monitor pipeline performance, troubleshoot issues, and perform routine maintenance on the Cloudera Data Platform and associated data processes.
  8. Collaboration: Work closely with other data engineers, analysts, product managers, and other stakeholders to understand data requirements and support various data-driven initiatives.
  9. Documentation: Maintain thorough documentation of data engineering processes, code, and pipeline configurations.
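As an illustration of item 5 (data quality and validation), the checks below are sketched in plain Python. In this role they would typically be expressed over PySpark DataFrames on CDP, but the logic is the same; the function and field names are hypothetical:

```python
# Sketch of routine data-quality checks for a batch: null rate on a
# measure column, value-range sanity, and key uniqueness.

def quality_report(rows, key="id", amount="amount", max_null_rate=0.1):
    """Return a dict of boolean check results for a batch of rows."""
    n = len(rows)
    nulls = sum(1 for r in rows if r.get(amount) is None)
    keys = [r.get(key) for r in rows]
    return {
        "null_rate_ok": n > 0 and nulls / n <= max_null_rate,
        "amounts_non_negative": all(
            r[amount] >= 0 for r in rows if r.get(amount) is not None
        ),
        "keys_unique": len(set(keys)) == len(keys),
    }

batch = [
    {"id": 1, "amount": 10.0},
    {"id": 2, "amount": 0.0},
    {"id": 3, "amount": None},   # one null out of three rows
]
report = quality_report(batch)
print(report["null_rate_ok"])   # False: null rate exceeds the 10% threshold
```

Wired into an Oozie or Airflow task, a failing report would halt the pipeline before bad data reaches downstream consumers.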

Qualifications

Education and Experience

  1. Bachelor's or Master's degree in Computer Science, Data Engineering, Information Systems, or a related field.
  2. 3+ years of experience as a Data Engineer, with a strong focus on PySpark and the Cloudera Data Platform.

Technical Skills

  1. PySpark: Advanced proficiency in PySpark, including working with RDDs, DataFrames, and optimization techniques.
  2. Cloudera Data Platform: Strong experience with Cloudera Data Platform (CDP) components, including Cloudera Manager, Hive, Impala, HDFS, and HBase.
  3. Data Warehousing: Knowledge of data warehousing concepts, ETL best practices, and experience with SQL-based tools (e.g., Hive, Impala).
  4. Big Data Technologies: Familiarity with Hadoop, Kafka, and other distributed computing tools.
  5. Orchestration and Scheduling: Experience with Apache Oozie, Airflow, or similar orchestration frameworks.
  6. Scripting and Automation: Strong scripting skills in Linux.

Soft Skills

  1. Strong analytical and problem-solving skills.
  2. Excellent verbal and written communication abilities.
  3. Ability to work independently and collaboratively in a team environment.
  4. Attention to detail and commitment to data quality.

Data Engineer

Dubai, Dubai myZoi

Posted today

Job Viewed

Tap Again To Close

Job Description

Dubai, United Arab Emirates | Posted on 07/29/2025

myZoiis changing lives for the better for those who deserve it the most. We are an excitingfintech start-up aiming to promote financial inclusion globally. Our vision isto provide a level playing field to the unbanked and the underbanked inaccessing essential financial services in an affordable, convenient, andtransparent fashion. We are looking for smart, ambitious, and purpose-drivenindividuals to join us in this journey.Please apply via the link below ifyou are interested.

TheRole

You will beworking in our Data Platform team, providing data capability for internal andproduct requirements for myZoi. You will be proactive and innovative and youwill be using 100% cloud technologies based on AWS and modern Open Sourcetooling to provide a real-time data infrastructure, allowing our teams to gainunprecedented insight into our wealth of application data. You will work with aworld-class team of Data Analysts and Engineers to provide best in classsolutions.

Responsibilities

ArchitectAWS-Centric Data Solutions:

  • Designand optimize high-performance data pipelines leveraging AWS native tools.
  • Architectmodular, AI-ready data lake with a roadmap to ensure secure ingestion,transformation, and consumption workflows.
  • Implementscalable streaming solutions that factor in performance, scalability and cost.

Embed Security & Compliance Across AWS Workloads:

  • Build and enforce data governance protocols aligned with relevant regulatory and compliance requirements using AWS tools.
  • Collaborate with cybersecurity teams to implement IAM best practices, encryption strategies, and secure networking.
  • Maintain traceability and auditability for all data flows across the AWS stack.

Optimize for Observability & Cost Efficiency:

  • Work with our Cloud Architect and SRE to deploy and fine-tune monitoring dashboards using Datadog and AWS CloudWatch for performance, anomaly detection, and security event correlation.
  • Continuously evaluate storage and compute cost optimization across S3, EC2, Redshift, and Glue workloads.

Lead Through Influence and Collaboration:

  • Partner with Data Science, Cloud Architect, Security, and Engineering leads to align cloud architecture with evolving business goals and priorities to ensure future-readiness.
  • Mentor junior engineers in AWS best practices, scalable design, and secure coding standards.
  • Lead innovation across key Product initiatives.

Innovate with Purpose:

  • Evaluate and integrate AWS-compatible orchestration tools such as Airflow, Lake Formation, ECS, EKS, or Managed Workflows.
  • Contribute to middleware and third-party orchestration strategies through secure APIs and event-driven patterns.
  • Design data products based on requirements that focus on key use cases, such as those related to social impact.
Required Qualifications
  • Bachelor's or Master's degree in Computer Science, Data Engineering, Software Engineering, or a related field.
  • 8–10 years of professional experience in data engineering, including 5+ years architecting on AWS underpinned by data governance. Mastery of AWS cloud services (S3, Lambda, Glue, Redshift, Kinesis, Lake Formation, Glue Crawlers, etc.).
  • Deep expertise in building scalable cloud-native solutions and managing secure data infrastructure that ensures data governance.
  • Strong command of compliance-driven architecture design and real-time monitoring strategies.
  • Good understanding of compliance frameworks related to data privacy and information security.
  • Excellent communication skills, proven leadership in mentoring, and the ability to lead cross-functional initiatives.
Technical Requirements
  • Proficiency with agile tools (Jira).
  • Cloud Infrastructure & AWS Services: S3, Glue, Lambda, Redshift, Kinesis, IAM, CloudWatch, Lake Formation, etc. Strong awareness of AWS security tools.
  • Data Orchestration: Experience with Apache Airflow on ECS or AWS Managed Workflows. Familiarity with Step Functions and event-driven orchestration patterns.
  • Streaming & ETL Pipelines: Expertise in Kinesis Data Streams and Kafka (AWS-hosted or compatible). Proficiency in designing and optimizing ETL workflows using AWS.
  • Monitoring & Observability: Awareness of or exposure to logs, alerting, monitoring, detection, and tuning.
  • Security & Governance: Awareness of or exposure to AWS KMS, plus building governance workflows with AWS Config and Lake Formation.
  • Data Modeling & Optimization: Extensive experience designing AI-ready data lakes with scalable ingestion and query performance.
  • Programming Languages: Advanced coding in Python and SQL. Experience with Java and ETL processes is also preferred.
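
As a small, hedged illustration of the Python skills behind the streaming requirement above: a Kinesis producer must group records into batches of at most 500 per `PutRecords` call. The sketch below shows only the batching logic (the AWS client call itself is omitted so the code stays testable without credentials):

```python
from typing import Iterable, Iterator, List

MAX_BATCH = 500  # Kinesis PutRecords accepts at most 500 records per call

def batches(records: Iterable[bytes], size: int = MAX_BATCH) -> Iterator[List[bytes]]:
    """Group an arbitrary record stream into PutRecords-sized batches."""
    batch: List[bytes] = []
    for rec in records:
        batch.append(rec)
        if len(batch) == size:
            yield batch
            batch = []
    if batch:
        yield batch  # flush the final partial batch

sizes = [len(b) for b in batches([b"x"] * 1201)]
# sizes == [500, 500, 201]
```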
About You
  • You have strong communication skills and curiosity, and you are a quick learner.
  • You enjoy a creative, fast-paced, agile world.
  • You enjoy mentoring and teaching other developers to create a world-class, cohesive team.
  • You enjoy making friends and having fun.

At myZoi we strive to create both a product and a team that embrace equality, inclusion, diversity, and freedom. We want people who can be themselves and bring their own brand of value to the team. Come and join us.


Data Engineer

Dubai, Dubai Solanalytics

Job Description

Job Description

Data Pipeline Architecture and Development:
  • Architect and optimize scalable data storage solutions, including data lakes, warehouses, and NoSQL databases, supporting large-scale analytics.
  • Design and maintain efficient data pipelines using technologies such as Apache Spark, Kafka, Fabric Data Factory, and Airflow, based on cross-functional team requirements.
Data Integration and ETL:
  • Develop robust ETL processes for reliable data ingestion, utilizing tools like SSIS, ADF, and custom Python scripts to ensure data quality and streamline workflows.
  • Optimize ETL performance through techniques like partitioning and parallel processing.
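
The partitioning and parallel-processing point above can be sketched in plain Python. This is a stand-in for what Spark or an ETL tool does under the hood; the date-based partition key and record shape are invented for illustration:

```python
from collections import defaultdict
from concurrent.futures import ThreadPoolExecutor

records = [
    {"day": "2025-07-01", "value": 10},
    {"day": "2025-07-01", "value": 5},
    {"day": "2025-07-02", "value": 7},
]

def partition_by_day(rows):
    """Split the input into independent partitions keyed by date."""
    parts = defaultdict(list)
    for r in rows:
        parts[r["day"]].append(r)
    return parts

def process_partition(rows):
    """Transform one partition; partitions share no state, so they can run in parallel."""
    return sum(r["value"] for r in rows)

parts = partition_by_day(records)
with ThreadPoolExecutor(max_workers=4) as pool:
    results = dict(zip(parts, pool.map(process_partition, parts.values())))
# results == {"2025-07-01": 15, "2025-07-02": 7}
```

The design point is that partitioning turns one large job into independent units of work, which is what makes the parallelism (threads here, executors in Spark) safe.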
Data Modeling and Schema Design:
  • Define and implement data models and schemas for structured and semi-structured sources, ensuring consistency and efficiency while collaborating with data teams to optimize performance.
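
As an illustration of the schema-design work described above, here is a minimal star-schema sketch: a fact table referencing a dimension table, queried with a join. It uses SQLite purely as a stand-in for a warehouse, and the table and column names are invented:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
-- Dimension table: one row per customer.
CREATE TABLE dim_customer (
    customer_id INTEGER PRIMARY KEY,
    name        TEXT NOT NULL,
    country     TEXT NOT NULL
);
-- Fact table: one row per order, referencing the dimension.
CREATE TABLE fact_orders (
    order_id     INTEGER PRIMARY KEY,
    customer_id  INTEGER NOT NULL REFERENCES dim_customer(customer_id),
    amount_cents INTEGER NOT NULL
);
""")
conn.execute("INSERT INTO dim_customer VALUES (1, 'Alice', 'AE')")
conn.executemany("INSERT INTO fact_orders VALUES (?, ?, ?)",
                 [(10, 1, 500), (11, 1, 250)])

row = conn.execute("""
    SELECT c.name, SUM(f.amount_cents)
    FROM fact_orders f JOIN dim_customer c USING (customer_id)
    GROUP BY c.name
""").fetchone()
# row == ("Alice", 750)
```

Keeping descriptive attributes in dimensions and measures in facts is what makes this layout consistent and efficient for analytics queries.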
Data Governance, Security, and Compliance:
  • Establish and enforce data governance policies, ensuring data quality, security, and compliance with regulations, using tools like Microsoft SQL Server.
  • Implement access controls, encryption, and auditing to protect sensitive data and collaborate with IT to address vulnerabilities.
Infrastructure Management and Optimization:
  • Manage and optimize cloud and on-prem infrastructure for data processing, monitor system performance, and implement disaster recovery enhancements.
  • Leverage automation for provisioning, configuration, and deployment to improve operational efficiency.
Team Leadership and Mentorship:
  • Provide technical leadership, mentoring team members in best practices and cloud technologies, while aligning data engineering initiatives with strategic goals.
Skills Required
  • Bachelor's degree or higher in Software Engineering, Computer Science, Engineering, or a related field.
  • 3-5 years of experience in data engineering, with a proven history of designing and implementing complex data infrastructure.
  • Proficient in Python, Scala, or Java, with experience in scalable, distributed systems.
  • Strong knowledge of cloud computing platforms and related services like AWS Glue, Azure Data Factory, or Google Dataflow.
  • Expertise in data modeling, schema design, and SQL query optimization for both relational and NoSQL databases.
  • Excellent communication and leadership skills, with the ability to collaborate effectively with cross-functional teams and stakeholders.
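
The SQL query-optimization skill listed above can be made concrete with SQLite's `EXPLAIN QUERY PLAN` (a stand-in for a real warehouse planner; the table and index names are invented). With an index on the filter column, the planner reports an index search instead of a full table scan:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id TEXT, ts TEXT)")
conn.execute("CREATE INDEX idx_events_user ON events (user_id)")

# Ask the planner how it would execute a filtered lookup.
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM events WHERE user_id = ?", ("u1",)
).fetchall()
uses_index = any("idx_events_user" in str(row) for row in plan)
# uses_index is True: the plan searches the index rather than scanning the table
```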
 
