257 ETL Developer jobs in the United Arab Emirates

ETL Developer

Dubai, Dubai | VAM Systems

Posted today

Job Description

We are currently looking for an ETL Developer - Banking for our UAE operations with the following skill set and terms and conditions.

Technical Skill Sets:

Data modelling, with good know-how of techniques such as star schema, snowflake schema, and dimensional modelling

Expert in database stored procedures and Structured Query Language (SQL), PL/SQL, and Java

ETL tools: Informatica, Microsoft SSIS

Business intelligence tools: SAP BusinessObjects, Microsoft Power BI

Core banking systems (preferred): Finacle, HPS PowerCARD

Soft Skills

Strong analytical and problem-solving abilities.

Excellent communication skills in English both written and verbal.

Ability to work collaboratively in a team environment.

Preferred Qualifications

Understanding of the UAE Banking Regulatory Framework Submissions (BRF).

Familiarity with the CBUAE's SupTech initiative and its objectives.

Experience in automating regulatory reporting processes.

Key Responsibilities

Automation Development: Design and implement automation solutions for BRF submissions, focusing on data extraction, transformation, and loading (ETL) processes (a minimal extraction sketch follows this list).

Database Management: Develop and optimize SQL, PL/SQL, Java, and HTML scripts to interface with our core banking system, FinCore, and its various modules, ensuring accurate and timely data retrieval.

Collaboration: Work closely with the Business Intelligence Manager/Unit to translate functional requirements into technical specifications.

Reporting Tools Integration: Assist in integrating and configuring reporting tools (e.g., Business Objects or other platforms) to streamline report generation.

Compliance Alignment: Ensure all automation processes comply with CBUAE's SupTech standards and data governance policies.

Documentation: Maintain comprehensive documentation of developed solutions, including technical specifications and user guides.

Support & Maintenance: Provide ongoing support for deployed solutions, addressing any issues and implementing enhancements as needed.
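
Purely as an illustration of the extraction step described above (not the actual FinCore implementation), the following Python sketch pulls a hypothetical reporting table into a CSV file; the table, columns, and connection details are invented for the example.

```python
# Hypothetical sketch: pull a BRF-style extract from a core banking schema
# and write it to a CSV for downstream submission. Table, column, and
# connection names are illustrative only.
import csv
import oracledb  # python-oracledb driver


def extract_brf_report(dsn: str, user: str, password: str, out_path: str) -> int:
    query = """
        SELECT account_id, branch_code, currency, outstanding_balance
        FROM fincore.loan_accounts          -- hypothetical module table
        WHERE reporting_date = TRUNC(SYSDATE)
    """
    with oracledb.connect(user=user, password=password, dsn=dsn) as conn:
        with conn.cursor() as cur:
            cur.execute(query)
            columns = [d[0] for d in cur.description]
            rows = cur.fetchall()

    with open(out_path, "w", newline="") as fh:
        writer = csv.writer(fh)
        writer.writerow(columns)
        writer.writerows(rows)
    return len(rows)


if __name__ == "__main__":
    count = extract_brf_report("bankdb_dsn", "etl_user", "secret", "brf_extract.csv")
    print(f"wrote {count} rows")
```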

Joining time frame: 2 weeks (maximum 1 month)

Additional Information :

Terms and conditions:

Joining time frame: maximum 4 weeks

Remote Work :

No

Employment Type :

Full-time


Senior ETL Developer

Abu Dhabi, Abu Dhabi | Dicetek LLC

Posted today

Job Description

We are seeking a highly skilled and experienced Senior ETL Developer to join our dynamic team. The ideal candidate will have extensive experience with Informatica BDM and Databricks pipeline, along with strong knowledge of SQL and PowerShell. The candidate should be proficient in designing ETL workflows and possess excellent communication skills. An understanding of data modeling and DAX queries is a plus.

Key Responsibilities

  • Design, develop, and maintain ETL processes using Informatica BDM and Databricks pipelines (a brief sketch follows this list).
  • Collaborate with data architects and business analysts to understand data requirements and translate them into ETL solutions.
  • Optimize and troubleshoot ETL workflows to ensure high performance and reliability.
  • Develop and maintain scripts using Oracle SQL and PowerShell for data extraction, transformation, and loading.
  • Ensure data quality and integrity throughout the ETL process.
  • Document ETL processes, workflows, and data mappings.
  • Communicate effectively with team members, stakeholders, and management to provide updates and gather requirements.
  • Utilize data modeling techniques and DAX queries to enhance data analysis and reporting capabilities.
  • Leverage Azure services and tools to support ETL processes and data integration.
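
By way of illustration only, the sketch below shows the general shape of such a pipeline in a Databricks notebook (where a spark session is already provided and Delta Lake is available); the paths, columns, and target table are hypothetical, and the production workflow would be built in Informatica BDM and Databricks jobs.

```python
# Minimal Databricks-style pipeline sketch. Assumes a Databricks notebook where
# `spark` is predefined; paths and table names are hypothetical.
from pyspark.sql import functions as F

RAW_PATH = "/mnt/raw/transactions/"          # hypothetical landing zone
TARGET_TABLE = "analytics.daily_transactions"

raw = spark.read.option("header", "true").csv(RAW_PATH)

cleaned = (raw
           .withColumn("amount", F.col("amount").cast("double"))
           .withColumn("txn_date", F.to_date("txn_date", "yyyy-MM-dd"))
           .dropDuplicates(["txn_id"])
           .filter(F.col("amount").isNotNull()))

daily = (cleaned
         .groupBy("txn_date", "branch_id")
         .agg(F.sum("amount").alias("total_amount"),
              F.count("*").alias("txn_count")))

# Overwrite the curated Delta table that downstream reports read from
daily.write.format("delta").mode("overwrite").saveAsTable(TARGET_TABLE)
```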

Qualifications
  • Bachelor's degree in Computer Science, Information Technology, or a related field.
  • Minimum of 7 years of experience in ETL development.
  • Strong experience with Informatica BDM and Databricks pipeline.
  • Proficient in Oracle SQL and PowerShell.
  • Experience in designing and optimizing ETL workflows.
  • Excellent problem-solving skills and attention to detail.
  • Strong communication and interpersonal skills.
  • Ability to work independently and as part of a team.
  • Understanding of data modeling and DAX queries is an added advantage.
  • Experience with Azure services and tools.

Preferred Qualifications
  • Knowledge of data warehousing concepts and best practices.
  • Certification in Informatica, Databricks, or Azure is a plus.
  • Seniority level: Not Applicable
  • Employment type: Contract
  • Job function: Business Development and Sales
  • Industries: IT Services and IT Consulting

Senior ETL Developer

Dubai, Dubai | beBeeData

Posted today

Job Description

Key Responsibilities

Skillfully design, develop, and maintain ETL solutions using SAP BusinessObjects Data Services (BODS), adhering to best practices and industry standards.

  • Architect, configure, and optimize SAP BODS environments, including understanding system architecture, components, and project organization.
  • Build robust data flows and workflows within BODS Designer, organizing projects for clarity, scalability, and maintainability.
  • Create, schedule, and troubleshoot batch jobs to ensure timely and accurate data integration across multiple systems and sources.
  • Leverage Data Integrator and Platform Transforms (lookup, merge, validation, etc.) to meet complex business and technical requirements.
  • Implement and monitor data quality transforms, ensuring clean, reliable, and accurate data throughout the ETL lifecycle.
  • Utilize functions, scripts, and variables within BODS for dynamic data manipulation and to support advanced transformation logic.
  • Perform thorough data profiling and assessment to identify data issues, inconsistencies, and opportunities for data quality improvement (a brief profiling sketch follows this list).
  • Design and implement robust error and exception handling mechanisms within ETL processes to ensure resilience and reliability.
  • Develop and optimize complex SQL and Oracle queries, including views and stored procedures, ensuring efficient data extraction and transformation.
  • Collaborate with BI/reporting teams, maintaining basic familiarity with reporting tools such as SAP BO, Tableau, or Power BI to support downstream analytics requirements.
  • Apply performance tuning and optimization techniques to enhance ETL job efficiency, minimize runtime, and ensure scalability.
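
As a rough illustration of that profiling step, here is a minimal Python/pandas sketch; SAP BODS provides its own profiling tooling, so this is only an analogy, and the input file and completeness threshold are hypothetical.

```python
# Minimal column-level profiling sketch in pandas; the extract file and the
# 5% completeness threshold are illustrative assumptions.
import pandas as pd


def profile(df: pd.DataFrame) -> pd.DataFrame:
    """Return per-column types, null statistics, and distinct counts."""
    return pd.DataFrame({
        "dtype": df.dtypes.astype(str),
        "null_count": df.isna().sum(),
        "null_pct": (df.isna().mean() * 100).round(2),
        "distinct_count": df.nunique(),
    })


if __name__ == "__main__":
    customers = pd.read_csv("customers_extract.csv")   # hypothetical extract
    report = profile(customers)
    print(report)
    # Columns that would fail a simple completeness rule
    print(report[report["null_pct"] > 5.0].index.tolist())
```
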
Required Skills and Qualifications

Proficient in:

  • Extensive SAP BODS Expertise: In-depth knowledge of SAP BusinessObjects Data Services, its architecture, and its core components.
  • BODS Designer Proficiency: Expert in data flows, workflows, and advanced project organization within BODS Designer.
  • Batch Job Management: Proven ability to create, schedule, and troubleshoot batch jobs for high-volume data integration.
  • Transformations Knowledge: Solid experience with Data Integrator and Platform Transforms such as lookup, merge, and validation transforms.
  • Data Quality Implementation: Capability to define and apply data quality transforms to ensure data accuracy and reliability.
  • Scripting & Functions: Proficient in BODS scripting, functions, and variable handling for dynamic ETL logic.
  • Data Assessment Skills: Skilled in data profiling, data quality assessment, and remediation of identified data issues.
  • Error & Exception Handling: Experience in designing resilient error and exception handling processes within ETL workflows.
  • SQL/Oracle Expertise: Strong command of SQL and Oracle database concepts, query performance tuning, and complex stored procedure development.
  • Reporting Tools Awareness: Familiarity with BI/reporting solutions (SAP BO, Tableau, Power BI) to understand integration requirements.
  • Performance Optimization: Knowledge of ETL performance tuning techniques and experience in optimizing large-scale data flows.
  • Learning Mindset: Demonstrated willingness to learn new tools and technologies, and ability to cross-skill as needed.
  • Bachelor's or Master's degree in Computer Science, Information Technology, Engineering, or a related discipline.
  • Relevant certifications in SAP BODS and/or Oracle databases are an advantage.
About Our Company

We're a global business and technology transformation partner, helping organizations accelerate their dual transition to a digital and sustainable world. We deliver end-to-end services and solutions leveraging strengths from strategy and design to engineering, all fueled by our market-leading capabilities in AI, cloud, and data, combined with our deep industry expertise and partner ecosystem.

Senior ETL Developer

Abu Dhabi, Abu Dhabi | beBeeData

Posted today

Job Description

ETL Data Integration Specialist

Design, develop and maintain ETL processes to transform data from various sources into a unified format for business intelligence purposes.

Key Responsibilities:

  • Ongoing development of complex ETL workflows using Informatica BDM and Databricks pipeline.
  • Collaboration with data architects and business analysts to translate business requirements into ETL solutions.
  • Optimization and troubleshooting of ETL processes to ensure high performance and reliability.
  • Scripting using Oracle SQL and PowerShell for data extraction, transformation, and loading.
  • Data quality and integrity throughout the ETL process.
  • Educating stakeholders on the importance of ETL best practices and compliance standards.
  • Communication with team members and management regarding updates and project status.
  • Adherence to Azure service level agreements and cloud security guidelines.

Requirements:

  • Bachelor's degree in Computer Science or related field.
  • A minimum of 7 years of experience in ETL development.
  • Proficient in Informatica BDM and Databricks pipeline.
  • Strong expertise in Oracle SQL and PowerShell.
  • Excellent problem-solving skills and attention to detail.
  • Effective communication and interpersonal skills.
  • Ability to work independently and as part of a collaborative team environment.
  • Understanding of data modeling techniques and DAX queries is an added advantage.

Skillful candidates will demonstrate their technical expertise, creative problem-solving abilities, and ability to work effectively within a dynamic team environment.

This role involves collaborating with cross-functional teams, providing timely support and maintaining ongoing knowledge of emerging trends and technologies in data integration.

Highly Skilled ETL Solutions Developer

Dubai, Dubai | beBeeDataEngineer

Posted today

Job Description

Our company seeks a highly skilled ETL solutions developer to join our team. The ideal candidate will possess in-depth knowledge of SAP BusinessObjects Data Services, its architecture, and core components.

The successful applicant will design, develop, and maintain ETL solutions using SAP BODS, ensuring adherence to best practices and industry standards. They will also architect, configure, and optimize SAP BODS environments, understanding system architecture, components, and project organization.

The role involves building robust data flows and workflows within BODS Designer, organizing projects for clarity, scalability, and maintainability. Additionally, the individual will create, schedule, and troubleshoot batch jobs to ensure timely and accurate data integration across multiple systems and sources.

The selected candidate will leverage Data Integrator and Platform Transforms (lookup, merge, validation, etc.) to meet complex business and technical requirements. They will also implement and monitor data quality transforms, ensuring clean, reliable, and accurate data throughout the ETL lifecycle.

Responsibilities include utilizing functions, scripts, and variables within BODS for dynamic data manipulation and to support advanced transformation logic. The individual will perform thorough data profiling and assessment to identify data issues, inconsistencies, and opportunities for data quality improvement.

They will design and implement robust error and exception handling mechanisms within ETL processes to ensure resilience and reliability. Furthermore, the selected candidate will develop and optimize complex SQL and Oracle queries, including views and stored procedures, ensuring efficient data extraction and transformation.
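
To make the error- and exception-handling point concrete, here is a minimal, generic Python sketch of a retry-and-log wrapper around a load step; the load_batch function and its failure mode are hypothetical placeholders rather than part of any actual SAP BODS or Oracle toolchain.

```python
# Generic retry-and-log wrapper for an ETL load step (illustrative only).
import logging
import time

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("etl")


def run_with_retries(step, *args, retries: int = 3, backoff_seconds: int = 30, **kwargs):
    """Run an ETL step, retrying transient failures with a fixed backoff."""
    for attempt in range(1, retries + 1):
        try:
            return step(*args, **kwargs)
        except Exception as exc:  # in practice, catch the specific driver errors
            log.error("step %s failed on attempt %d/%d: %s",
                      step.__name__, attempt, retries, exc)
            if attempt == retries:
                raise            # surface the failure to the scheduler
            time.sleep(backoff_seconds)


def load_batch(batch_id: str) -> int:
    # placeholder for the real load logic (e.g., a batch job trigger or a
    # SQL MERGE executed over a database connection)
    raise RuntimeError("simulated transient failure")


if __name__ == "__main__":
    try:
        run_with_retries(load_batch, "2024-01-31", backoff_seconds=1)
    except RuntimeError:
        log.error("batch permanently failed; routing to an exception queue")
```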

Collaboration with BI/reporting teams is essential, maintaining basic familiarity with reporting tools such as SAP BO, Tableau, or Power BI to support downstream analytics requirements.

Performance tuning and optimization techniques will be applied to enhance ETL job efficiency, minimize runtime, and ensure scalability.

A proactive approach to learning new technologies and cross-skilling across domains is required, contributing to a culture of continuous improvement within the team.

Required Skills and Qualifications:

  • Extensive SAP BODS Expertise: In-depth knowledge of SAP BusinessObjects Data Services, its architecture, and its core components.
  • BODS Designer Proficiency: Expert in data flows, workflows, and advanced project organization within BODS Designer.
  • Batch Job Management: Proven ability to create, schedule, and troubleshoot batch jobs for high-volume data integration.
  • Transformations Knowledge: Solid experience with Data Integrator and Platform Transforms such as lookup, merge, and validation transforms.
  • Data Quality Implementation: Capability to define and apply data quality transforms to ensure data accuracy and reliability.
  • Scripting & Functions: Proficient in BODS scripting, functions, and variable handling for dynamic ETL logic.
  • Data Assessment Skills: Skilled in data profiling, data quality assessment, and remediation of identified data issues.
  • Error & Exception Handling: Experience in designing resilient error and exception handling processes within ETL workflows.
  • SQL/Oracle Expertise: Strong command of SQL and Oracle database concepts, query performance tuning, and complex stored procedure development.
  • Reporting Tools Awareness: Familiarity with BI/reporting solutions (SAP BO, Tableau, Power BI) to understand integration requirements.
  • Performance Optimization: Knowledge of ETL performance tuning techniques and experience in optimizing large-scale data flows.
  • Learning Mindset: Demonstrated willingness to learn new tools and technologies, and ability to cross-skill as needed.
  • Bachelor's or Master's degree in Computer Science, Information Technology, Engineering, or a related discipline.
  • Relevant certifications in SAP BODS and/or Oracle databases are an advantage.

Benefits:

Joining our team offers numerous benefits. Our company fosters a collaborative environment where professionals can grow, learn, and thrive. As a member of our team, you will have access to cutting-edge technology, ongoing training, and mentorship from experienced colleagues. You will be part of a diverse collective of talented individuals who share a passion for innovation and excellence.

At our company, we believe that a happy and healthy workforce leads to increased productivity and better outcomes. We offer a range of benefits aimed at supporting your well-being, including comprehensive health insurance, flexible working arrangements, and opportunities for professional development.

Data Engineer

Dubai, Dubai | Teliolabs Communication Private Limited

Posted today

Job Description

Location: Dubai

Who Can Apply: Candidates currently based in Dubai
Job Type: Contract
Experience: 8+ years

Job Summary:

We are looking for an experienced Data Engineer to design, develop, and optimize data pipelines, ETL processes, and data integration solutions. The ideal candidate should have expertise in AWS cloud services, data engineering best practices, open-source tools, and data schema design. The role requires hands-on experience with large-scale data processing, real-time data streaming, and cloud-based data architectures.

Key Responsibilities:

  • Develop and Maintain Data Pipelines to process structured and unstructured data efficiently.
  • Implement ETL/ELT Workflows for batch and real-time data processing.
  • Optimize Data Processing Workflows using distributed computing frameworks.
  • Ensure Data Integrity and Quality through data validation, cleaning, and transformation techniques.
  • Work with AWS Cloud Services, including S3, Redshift, Glue, Lambda, DynamoDB, and Kinesis.
  • Leverage Open-Source Tools like Apache Spark, Airflow, Kafka, and Flink for data processing (see the Airflow sketch after this list).
  • Manage and Optimize Database Performance for both SQL and NoSQL environments.
  • Collaborate with Data Scientists and Analysts to enable AI/ML model deployment and data accessibility.
  • Support Data Migration Initiatives from on-premise to cloud-based data platforms.
  • Ensure Compliance and Security Standards in handling sensitive and regulated data.
  • Develop Data Models and Schemas for efficient storage and retrieval.
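
As an illustration of the orchestration tooling named above, here is a minimal Apache Airflow DAG sketch for a daily batch ETL; the DAG id, schedule, and task bodies are hypothetical stand-ins rather than this team's actual pipelines.

```python
# Minimal daily batch ETL DAG (Airflow 2.x style); task logic is a placeholder.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # placeholder: pull the day's batch from a source system (API, S3, database)
    print("extracting for", context["ds"])


def transform(**context):
    # placeholder: clean and reshape the extracted batch
    print("transforming for", context["ds"])


def load(**context):
    # placeholder: write to the warehouse (e.g., a Redshift COPY)
    print("loading for", context["ds"])


with DAG(
    dag_id="daily_batch_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    t_extract >> t_transform >> t_load
```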

Required Skills & Qualifications:

  • 8+ years of experience in data engineering, data architecture, and cloud computing.
  • Strong knowledge of AWS Services such as Glue, Redshift, Athena, Lambda, and S3.
  • Expertise in ETL Tools, including Talend, Apache NiFi, Informatica, dbt, and AWS Glue.
  • Proficiency in Open-Source Tools such as Apache Spark, Hadoop, Airflow, Kafka, and Flink.
  • Strong Programming Skills in Python, SQL, and Scala.
  • Experience in Data Schema Design, normalization, and performance optimization.
  • Knowledge of Real-time Data Streaming using Kafka, Kinesis, or Apache Flink.
  • Experience in Data Warehouse and Data Lake Solutions.
  • Hands-on experience with DevOps and CI/CD Pipelines for data engineering workflows.
  • Understanding of AI and Machine Learning Data Pipelines.
  • Strong analytical and problem-solving skills.

Preferred Qualifications:

  • AWS Certified Data Analytics – Specialty or AWS Solutions Architect certification.
  • Experience with Kubernetes, Docker, and serverless data processing.
  • Exposure to MLOps and data engineering practices for AI/ML solutions.
  • Experience with distributed computing and big data frameworks.

Data Engineer

Dubai, Dubai | Everythinginclick

Posted today

Job Description

The Data Engineer will be responsible for developing semantic models on top of the Data Lake/Data Warehouse to fulfill the self-service BI foundation requirements. This includes data extraction from various data sources and integration into the central data lake/data warehouse using enterprise platforms like Informatica iPaaS.

Key Responsibilities of Data Engineer
  1. Designing data warehouse data models based on business requirements (a brief sketch follows this list).
  2. Designing, developing, and testing both batch and real-time Extract, Transform and Load (ETL) processes required for data integration.
  3. Ingesting both structured and unstructured data into the SMBU data lake/data warehouse system.
  4. Designing and developing semantic models/self-service cubes.
  5. Performing BI administration and access management to ensure access and reports are properly governed.
  6. Performing unit testing and data validation to ensure business UAT is successful.
  7. Performing ad-hoc data analysis and presenting results in a clear manner.
  8. Assessing data quality of the source systems and proposing enhancements to achieve a satisfactory level of data accuracy.
  9. Optimizing ETL processes to ensure execution time meets requirements.
  10. Maintaining and architecting ETL pipelines to ensure data is loaded on time on a regular basis.
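
As a small, hypothetical illustration of the dimensional-modelling work listed above, the PySpark sketch below builds a customer dimension and a sales fact from invented lake paths and columns; it is not the actual SMBU model.

```python
# Hypothetical star-schema build in PySpark: one dimension, one fact.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dim_fact_build").getOrCreate()

sales = spark.read.parquet("/lake/raw/sales")          # hypothetical raw zone
customers = spark.read.parquet("/lake/raw/customers")

# Dimension: deduplicated customer attributes plus a surrogate key
dim_customer = (customers
                .select("customer_id", "name", "segment", "country")
                .dropDuplicates(["customer_id"])
                .withColumn("customer_sk", F.monotonically_increasing_id()))

# Fact: measures keyed by the dimension's surrogate key
fact_sales = (sales
              .join(dim_customer.select("customer_id", "customer_sk"), "customer_id")
              .select("customer_sk", "order_id", "order_date", "quantity", "net_amount"))

dim_customer.write.mode("overwrite").parquet("/lake/curated/dim_customer")
fact_sales.write.mode("overwrite").parquet("/lake/curated/fact_sales")
```
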
Qualification Required for Data Engineer
  1. 5 to 8 years of overall experience.
  2. Proven experience in the development of dimensional models in Azure Synapse with strong SQL knowledge.
  3. Minimum of 3 years working as a Data Engineer in the Azure ecosystem, specifically using Synapse, ADF & Databricks.
  4. Preferably 3 years of experience with data warehousing, ETL development, SQL Queries, Synapse, ADF, PySpark, and Informatica iPaaS for data ingestion & data modeling.

Data Engineer

Dubai, Dubai | myZoi

Posted today

Job Description

Dubai, United Arab Emirates | Posted on 07/29/2025

myZoi is changing lives for the better for those who deserve it the most. We are an exciting fintech start-up aiming to promote financial inclusion globally. Our vision is to provide a level playing field to the unbanked and the underbanked in accessing essential financial services in an affordable, convenient, and transparent fashion. We are looking for smart, ambitious, and purpose-driven individuals to join us in this journey. Please apply via the link below if you are interested.

The Role

You will be working in our Data Platform team, providing data capability for internal and product requirements for myZoi. You will be proactive and innovative, and you will be using 100% cloud technologies based on AWS and modern open-source tooling to provide a real-time data infrastructure, allowing our teams to gain unprecedented insight into our wealth of application data. You will work with a world-class team of Data Analysts and Engineers to provide best-in-class solutions.

Responsibilities

Architect AWS-Centric Data Solutions:

  • Design and optimize high-performance data pipelines leveraging AWS native tools.
  • Architect a modular, AI-ready data lake with a roadmap to ensure secure ingestion, transformation, and consumption workflows.
  • Implement scalable streaming solutions that factor in performance, scalability, and cost (a small Kinesis sketch follows the responsibilities list).

Embed Security & Compliance Across AWS Workloads:

  • Build and enforce data governance protocols aligned with relevant regulatory and compliance requirements using AWS tools.
  • Collaborate with cybersecurity teams to implement IAM best practices, encryption strategies, and secure networking.
  • Maintain traceability and auditability for all data flows across the AWS stack.

Optimize for Observability & Cost Efficiency:

  • Work with our Cloud Architect and SRE to deploy and fine-tune monitoring dashboards using Datadog and AWS CloudWatch for performance, anomaly detection, and security event correlation.
  • Continuously evaluate storage and compute cost optimization across S3, EC2, Redshift, and Glue workloads.

Lead Through Influence and Collaboration:

  • Partner with Data Science, Cloud Architect, Security, and Engineering leads to align cloud architecture with evolving business goals and priorities to ensure future-readiness.
  • Mentor junior engineers in AWS best practices, scalable design, and secure coding standards.
  • Lead innovation across key Product initiatives.

Innovate with Purpose:

  • Evaluate and integrate AWS-compatible orchestration tools like Airflow, Lake Formation, ECS, EKS, or Managed Workflows.
  • Contribute to middleware and third-party orchestration strategies through secure APIs and event-driven patterns.
  • Design data products based on requirements that focus on key use cases, such as those related to social impact.
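
For illustration only, here is a minimal boto3 sketch of the kind of streaming ingestion step referenced above; the stream name, region, and event payload are assumptions, not myZoi's actual setup.

```python
# Minimal Kinesis producer sketch (boto3); stream, region, and payload are
# illustrative assumptions.
import json

import boto3

kinesis = boto3.client("kinesis", region_name="me-central-1")  # assumed region


def publish_event(event: dict, stream_name: str = "app-events") -> str:
    """Put one JSON event onto a Kinesis data stream, partitioned by user id."""
    response = kinesis.put_record(
        StreamName=stream_name,
        Data=json.dumps(event).encode("utf-8"),
        PartitionKey=str(event.get("user_id", "unknown")),
    )
    return response["SequenceNumber"]


if __name__ == "__main__":
    seq = publish_event({"user_id": 42, "type": "wallet_topup", "amount_aed": 100})
    print("published with sequence number", seq)
```
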
Required Qualifications
  • Bachelor’s or Master’s degree in Computer Science, Data Engineering, Software Engineering, or a related field.
  • 8–10 years of professional experience in data engineering, including 5+ years architecting on AWS underpinned by data governance. Mastery of AWS cloud services (S3, Lambda, Glue, Redshift, Kinesis, Lake Formation, Crawler, etc.).
  • Deep expertise in building scalable cloud-native solutions and managing secure data infrastructure ensuring data governance.
  • Strong command of compliance-driven architecture design and real-time monitoring strategies.
  • Good understanding of compliance frameworks related to data privacy and information security.
  • Excellent communication skills, proven leadership in mentoring, and the ability to lead cross-functional initiatives.
Technical Requirements
  • Proficiency with agile tools (Jira).
  • Cloud Infrastructure & AWS Services: S3, Glue, Lambda, Redshift, Kinesis, IAM, CloudWatch, Lake Formation, etc. Strong awareness of AWS security tools.
  • Data Orchestration: Experience with Apache Airflow on ECS or AWS Managed Workflows. Familiarity with Step Functions and event-driven orchestration patterns.
  • Streaming & ETL Pipelines: Expertise in Kinesis Data Streams and Kafka (AWS-hosted or compatible). Proficiency in designing and optimizing ETL workflows using AWS.
  • Monitoring & Observability: Awareness of or exposure to logs, alerting, monitoring, detection, and tuning.
  • Security & Governance: Awareness of or exposure to AWS KMS. In addition, building governance workflows with AWS Config and Lake Formation.
  • Data Modeling & Optimization: Extensive experience in design of AI-ready data lakes with scalable ingestion and query performance.
  • Programming Languages: Advanced coding in Python and SQL. Experience in Java and ETL processes is also preferred.
About You
  • You have strong communication skills and curiosity, and you are a quick learner.
  • You enjoy a creative, fast-paced, agile world.
  • You enjoy mentoring and teaching other developers to create a world-class, cohesive team.
  • You enjoy making friends and having fun.

At myZoi we strive to create both a product and a team that embraces equality, inclusion, diversity, and freedom. We want people who can be themselves and bring their own brand of value to the team. Come and join us!


Data Engineer

Dubai, Dubai | WEbook, Inc.

Posted today

Job Description

Do you want to love what you do at work? Do you want to make a difference, an impact, and transform people's lives? Do you want to work with a team that believes in disrupting the normal, boring, and average?

If yes, then this is the job for you. webook.com is Saudi’s #1 event ticketing and experience booking platform in terms of technology, features, agility, and revenue, serving some of the largest mega events in the Kingdom with over 2 billion in sales. webook.com is part of the Supertech Group, which also includes UXBERT Labs, one of the best digital and user experience design agencies in the GCC, along with Kafu Games, the largest esports tournament platform in MENA.

Key Responsibilities:

  1. Data Integration and ETL Development: Architect and implement robust data integration pipelines to extract, transform, and load data from various sources (e.g., databases, SaaS applications, APIs, and flat files) into a centralized data platform. Design and develop complex ETL processes to ensure data quality, consistency, and reliability. Optimize data transformation workflows for performance and scalability.
  2. Data Infrastructure and Platform Management: Implement and maintain data ingestion, processing, and storage solutions to support data and analytics needs. Ensure data infrastructure's reliability, security, and availability through monitoring, troubleshooting, and disaster recovery planning.
  3. Data Governance and Metadata Management: Collaborate with the data governance team to establish policies, standards, and procedures. Develop and maintain metadata management systems for data lineage, provenance, and traceability. Implement data quality control measures and validation processes to ensure data integrity.

Minimum Requirements:

  • 5-6 years of experience as a Data Engineer or in a related data-driven role.
  • Proficient in designing and implementing data pipelines using tools like Apache Airflow, Airbyte, or cloud-based services.
  • Strong experience with data infrastructure such as data lakes, data warehouses, and real-time streaming platforms (e.g., Elastic, Google BigQuery, MongoDB).
  • Expertise in data modeling, data quality, and metadata management.
  • Proficient in programming languages like Python or Java, and SQL.
  • Familiar with cloud platforms (AWS, Google Cloud) and DevOps practices.
  • Excellent problem-solving skills and ability to work collaboratively across teams.
  • Strong communication skills to translate technical concepts to stakeholders.

Preferred Qualifications:

  • Experience with data visualization and BI tools (e.g., Tableau, Qlik).
  • Knowledge of machine learning and AI applications in data initiatives.
  • Project management experience and leadership in data projects.

Data Engineer

Abu Dhabi, Abu Dhabi | Contango

Posted today

Job Description

Tasks

About the Role

We are an emerging AI-native, product-driven, agile start-up under the Abu Dhabi government, and we are seeking a motivated and technically versatile Data Engineer to join our team. You will play a key role in delivering data platforms, pipelines, and ML enablement within a Databricks on Azure environment.

As part of a stream-aligned delivery team, you’ll work closely with Data Scientists, Architects, and Product Managers to build scalable, high-quality data solutions for clients. You'll be empowered by a collaborative environment that values continuous learning, Agile best practices, and technical excellence.

Ideal candidates have strong hands-on experience in Databricks, Python, and ADF, and are comfortable in fast-paced, client-facing consulting engagements.

Skills and Experience requirements
1. Technical

  • Databricks (or similar), e.g. notebooks (Python, SQL), Delta Lake, job scheduling, clusters and workspace management, Unity Catalog, and access control awareness
  • Cloud data engineering – ideally Azure, including storage (e.g., ADLS, S3), compute, and secrets management
  • Development languages such as Python, SQL, C#, JavaScript, etc., especially for data ingestion, cleaning, and transformation
  • ETL / ELT – including structured logging, error handling, reprocessing strategies, APIs, flat files, databases, message queues, event streaming, event sourcing, etc.
  • Automated testing (ideally TDD), pairing/mobbing, Trunk-Based Development, Continuous Deployment, and Infrastructure-as-Code (Terraform)
  • Git and CI/CD for notebooks, data pipelines, and deployments

2. Integration & Data Handling

  • Experienced in delivering platforms for clients – including file transfer, APIs (REST, etc.), SQL/NoSQL/graph databases, JSON, CSV, XML, Parquet, etc.
  • Data validation and profiling: assess incoming data quality and cope with schema drift, deduplication, and reconciliation
  • Testing and monitoring pipelines: unit tests for transformations, data checks, and pipeline observability (a brief pytest sketch follows this list)
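
As a brief, hypothetical illustration of unit-testing a transformation (in line with the TDD emphasis above), the sketch below exercises an invented clean_orders function with pytest; it is not part of the actual codebase.

```python
# Run with: pytest test_clean_orders.py
# The clean_orders transformation and its rules are hypothetical examples.
import pandas as pd


def clean_orders(df: pd.DataFrame) -> pd.DataFrame:
    """Drop duplicate order ids and rows with non-positive amounts."""
    return (df.drop_duplicates(subset=["order_id"])
              .loc[lambda d: d["amount"] > 0]
              .reset_index(drop=True))


def test_clean_orders_removes_duplicates_and_bad_rows():
    raw = pd.DataFrame({
        "order_id": [1, 1, 2, 3],
        "amount": [10.0, 10.0, -5.0, 7.5],
    })
    result = clean_orders(raw)
    assert list(result["order_id"]) == [1, 3]
    assert (result["amount"] > 0).all()


def test_clean_orders_is_idempotent():
    raw = pd.DataFrame({"order_id": [1, 2], "amount": [1.0, 2.0]})
    once = clean_orders(raw)
    twice = clean_orders(once)
    pd.testing.assert_frame_equal(once, twice)
```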

3. Working Style

  • Comfortable leveraging the best of lean, agile and waterfall approaches. Can contribute to planning, estimation, and documentation, but also collaborative daily re-prioritisation
  • Able to explain technical decisions to teammates or clients
  • Documents decisions and keeps stakeholders informed
  • Comfortable seeking support from other teams for Product, Databricks, Data architecture
  • Happy to collaborate with Data Science team on complex subsystems
Requirements

Nice-to-haves

  • MLflow or light MLOps experience (for the data science touchpoints)
  • Dbt / dagster / airflow or similar transformation tools
  • Understanding of security and compliance (esp. around client data)
  • Past experience in consulting or client-facing roles

Candidate Requirements

  • 5–8 years (minimum 3–4 years hands-on with cloud/data engineering, 1–2 years in Databricks/Azure, and team/project leadership exposure)
  • Bachelor’s degree in Computer Science, Data Engineering, Software Engineering, or Information Systems

Job Type: Full-time

Benefits

Visa, Insurance, Yearly Flight Ticket, Bonus scheme, relocation logistics covered

The interviewing process consists of 2 or 3 technical/behavioral interviews.
