165 Data Engineer jobs in Dubai
Data Engineer
Posted today
Job Description
The Data Engineer will be responsible for developing semantic models on top of the Data Lake/Data Warehouse to fulfill the self-service BI foundation requirements. This includes data extraction from various data sources and integration into the central data lake/data warehouse using enterprise platforms like Informatica iPaaS.
Key Responsibilities of the Data Engineer:
- Designing data warehouse data models based on business requirements.
- Designing, developing, and testing both batch and real-time Extract, Transform and Load (ETL) processes required for data integration.
- Ingesting both structured and unstructured data into the SMBU data lake/data warehouse system.
- Designing and developing semantic models/self-service cubes.
- Performing BI administration and access management to ensure access and reports are properly governed.
- Performing unit testing and data validation to ensure business UAT is successful.
- Performing ad-hoc data analysis and presenting results in a clear manner.
- Assessing data quality of the source systems and proposing enhancements to achieve a satisfactory level of data accuracy.
- Optimizing ETL processes to ensure execution time meets requirements.
- Maintaining and architecting ETL pipelines to ensure data is loaded on time on a regular basis.
Requirements:
- 5 to 8 years of overall experience.
- Proven experience in the development of dimensional models in Azure Synapse with strong SQL knowledge.
- Minimum of 3 years working as a Data Engineer in the Azure ecosystem, specifically using Synapse, ADF, and Databricks.
- Preferably 3 years of experience with data warehousing, ETL development, SQL Queries, Synapse, ADF, PySpark, and Informatica iPaaS for data ingestion & data modeling.
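As background for the dimensional-modelling skills listed above, the core of a star-schema load step can be sketched in plain Python. This is a minimal illustration, not the employer's actual stack; the table and field names are hypothetical:

```python
# Minimal sketch of a star-schema load step: resolve each fact row's
# natural key to a surrogate key in the customer dimension, inserting
# new dimension entries on first sight (a "late-arriving dimension" upsert).
def load_facts(fact_rows, dim_customer):
    """fact_rows: iterable of dicts with 'customer_id' and 'amount'.
    dim_customer: dict mapping natural key -> surrogate key (mutated in place)."""
    loaded = []
    for row in fact_rows:
        nk = row["customer_id"]
        if nk not in dim_customer:
            # Assign the next surrogate key for an unseen natural key.
            dim_customer[nk] = len(dim_customer) + 1
        loaded.append({"customer_sk": dim_customer[nk], "amount": row["amount"]})
    return loaded

dim = {"C001": 1}
facts = load_facts(
    [{"customer_id": "C001", "amount": 50.0},
     {"customer_id": "C002", "amount": 75.0}],
    dim,
)
```

In a real Synapse or PySpark pipeline the dimension lookup would be a join against the dimension table, but the key-resolution logic is the same.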
Data Engineer
Job Description
Job Summary:
We are looking for a talented and experienced Data Engineer to join our team. As a Data Engineer, you will be responsible for designing, building, and maintaining our data pipelines and infrastructure. You will work closely with our data scientists and analysts to ensure that our data is accessible, reliable, and secure.
Responsibilities:
- Design, develop, and maintain scalable data pipelines.
- Build and manage data warehouses and data lakes.
- Implement data quality and data governance best practices.
- Work with data scientists and analysts to support their research and development projects.
- Collaborate with other engineers to build and maintain our data infrastructure.
Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field
- 3-5 years of experience in a data engineering role
- Strong programming skills in Python, Java, or Scala
- Experience with big data technologies such as Hadoop, Spark, and Kafka
- Experience with cloud computing platforms such as AWS, Azure, or Google Cloud Platform
- Experience with SQL and NoSQL database systems such as PostgreSQL and Cassandra
- Experience with data warehousing and ETL (extract, transform, load) processes
- Excellent problem-solving and analytical skills
- Strong communication and teamwork skills
Bonus Points:
- Experience with SAS, Kubernetes, or Docker
- Experience with machine learning and artificial intelligence
- Experience with cloud-native development
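For context on the ETL (extract, transform, load) processes mentioned in the qualifications, here is a minimal, self-contained sketch using SQLite as a stand-in for a real warehouse; all table and field names are illustrative assumptions:

```python
import sqlite3

# Extract: raw source records (in practice these would come from an API,
# a file drop, or a source database).
raw = [("ada", "  42 "), ("bob", "17"), ("eve", "n/a")]

# Transform: trim whitespace and drop rows that fail a numeric check.
clean = []
for name, value in raw:
    value = value.strip()
    if value.isdigit():
        clean.append((name, int(value)))

# Load: write the cleaned rows into a warehouse table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE scores (name TEXT, score INTEGER)")
conn.executemany("INSERT INTO scores VALUES (?, ?)", clean)
total = conn.execute("SELECT SUM(score) FROM scores").fetchone()[0]
```

Production pipelines add scheduling, incremental loads, and error routing, but the extract-transform-load shape stays the same.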
Data Engineer
Job Description
Location : Dubai
Who Can Apply: Candidates who are currently in Dubai
Job Type: Contract
Experience: Minimum 8+ years
Job Summary:
We are looking for an experienced Data Engineer to design, develop, and optimize data pipelines, ETL processes, and data integration solutions. The ideal candidate should have expertise in AWS cloud services, data engineering best practices, open-source tools, and data schema design. The role requires hands-on experience with large-scale data processing, real-time data streaming, and cloud-based data architectures.
Key Responsibilities:
- Develop and Maintain Data Pipelines to process structured and unstructured data efficiently.
- Implement ETL/ELT Workflows for batch and real-time data processing.
- Optimize Data Processing Workflows using distributed computing frameworks.
- Ensure Data Integrity and Quality through data validation, cleaning, and transformation techniques.
- Work with AWS Cloud Services, including S3, Redshift, Glue, Lambda, DynamoDB, and Kinesis.
- Leverage Open-Source Tools like Apache Spark, Airflow, Kafka, and Flink for data processing.
- Manage and Optimize Database Performance for both SQL and NoSQL environments.
- Collaborate with Data Scientists and Analysts to enable AI/ML model deployment and data accessibility.
- Support Data Migration Initiatives from on-premise to cloud-based data platforms.
- Ensure Compliance and Security Standards in handling sensitive and regulated data.
- Develop Data Models and Schemas for efficient storage and retrieval.
Required Skills & Qualifications:
- 8+ years of experience in data engineering, data architecture, and cloud computing.
- Strong knowledge of AWS Services such as Glue, Redshift, Athena, Lambda, and S3.
- Expertise in ETL Tools, including Talend, Apache NiFi, Informatica, dbt, and AWS Glue.
- Proficiency in Open-Source Tools such as Apache Spark, Hadoop, Airflow, Kafka, and Flink.
- Strong Programming Skills in Python, SQL, and Scala.
- Experience in Data Schema Design, normalization, and performance optimization.
- Knowledge of Real-time Data Streaming using Kafka, Kinesis, or Apache Flink.
- Experience in Data Warehouse and Data Lake Solutions.
- Hands-on experience with DevOps and CI/CD Pipelines for data engineering workflows.
- Understanding of AI and Machine Learning Data Pipelines .
- Strong analytical and problem-solving skills.
Preferred Qualifications:
- AWS Certified Data Analytics – Specialty or AWS Solutions Architect certification.
- Experience with Kubernetes, Docker, and serverless data processing.
- Exposure to MLOps and data engineering practices for AI/ML solutions.
- Experience with distributed computing and big data frameworks.
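As an illustration of the data validation and quality work described above, a row-level quality gate can be sketched as follows. The rules and field names are hypothetical examples, not requirements of the role:

```python
# Minimal sketch of a row-level data-quality gate: each rule returns True
# for a valid row; failing rows are routed to a reject list together with
# the names of the rules they violated.
RULES = {
    "non_empty_id": lambda r: bool(r.get("id")),
    "amount_positive": lambda r: isinstance(r.get("amount"), (int, float)) and r["amount"] > 0,
}

def validate(rows):
    good, rejects = [], []
    for row in rows:
        failed = [name for name, rule in RULES.items() if not rule(row)]
        if failed:
            rejects.append((row, failed))
        else:
            good.append(row)
    return good, rejects

good, rejects = validate([
    {"id": "a1", "amount": 10.0},
    {"id": "", "amount": 5.0},
    {"id": "a2", "amount": -3.0},
])
```

In services like AWS Glue the same idea appears as data-quality rulesets; the reject list typically lands in a quarantine location for later inspection.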
Data Engineer
Job Description
Key Skills:
- SQL
- Kafka
- Hadoop
- Spark
- MySQL
- Event streaming
- Metadata management
- Database optimization
- Capital markets domain knowledge
- Problem-solving
- Documentation
Job Type: Full-time
Job Summary:
Join our team as a Senior Data Engineer - Capital Markets and play a vital role in shaping the data infrastructure for a dynamic stock exchange environment. You will design, implement, and optimize robust data solutions that power advanced business intelligence and real-time insights for our leading financial services institution. Our async culture values written communication, enabling deeper focus and effective collaboration across the business.
Key Responsibilities:
- Design, develop, and maintain scalable, secure data architectures and ETL pipelines to support business intelligence initiatives.
- Implement and optimize real-time event streaming frameworks and services using technologies such as Kafka.
- Manage and optimize relational and non-relational databases to ensure high performance and availability.
- Leverage big data technologies (Hadoop, Spark, Kafka) to process, analyze, and deliver insights from large volumes of capital markets data.
- Establish and maintain robust metadata management and data lineage practices, ensuring data quality and compliance.
- Collaborate closely with data scientists, business analysts, and stakeholders to understand and deliver on evolving data requirements.
- Lead cross-functional problem solving for data integrity, reliability, and performance issues.
Required Skills and Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related discipline.
- 6-9 years of data engineering experience, including 2-5 years specifically within a leading stock exchange (e.g., Dubai or London).
- Strong command of SQL and expertise with database management systems (MySQL, PostgreSQL, MongoDB).
- Proficiency in developing end-to-end ETL pipelines and integrating multiple data sources.
- Hands-on experience with big data tools (Hadoop, Spark) and event streaming technologies (Kafka, Event Hub, etc.).
- Solid grasp of DevOps practices relevant to data engineering workflows.
- Exceptional problem-solving abilities and a meticulous approach to data quality and documentation.
- Prior experience working with or supporting data initiatives for capital markets or stock exchanges.
- Demonstrated expertise in creating and managing metadata repositories.
- Advanced knowledge of real-time data streaming and analytics platforms.
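The real-time event streaming and analytics work described above often reduces to windowed aggregations over an event stream. A minimal sketch follows (a tumbling-window volume calculation; the event fields are illustrative, not from the posting):

```python
# Minimal sketch of a tumbling-window aggregation over a trade event
# stream, the kind of computation a Kafka consumer would perform.
# Events are (timestamp_seconds, price, quantity); the window is 60 s.
from collections import defaultdict

def tumbling_volume(events, window_s=60):
    """Return total traded volume (price * quantity) keyed by window start."""
    windows = defaultdict(float)
    for ts, price, qty in events:
        window_start = (ts // window_s) * window_s
        windows[window_start] += price * qty
    return dict(windows)

volumes = tumbling_volume([
    (10, 100.0, 2),   # falls in window starting at 0
    (59, 101.0, 1),   # same window
    (61, 102.0, 3),   # window starting at 60
])
```

Stream processors such as Spark Structured Streaming or Flink provide this windowing natively, along with handling for late and out-of-order events, which this sketch deliberately omits.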
Data Engineer
Job Description
CFI Financial Group is an award-winning trading provider, possessing more than 25 years of experience with multiple offices around the world including London, Larnaca, Beirut, Amman, Dubai, Kuwait, Port Louis, and others.
CFI is hiring! Make your mark in the online trading industry.
Are you looking to pursue a career in finance? Do you want to work with a dynamic and growing team in the exciting world of online trading and investing? If you answered yes, then we have some amazing opportunities for you!
Description:
We are seeking a highly motivated and experienced Data Engineer to join our team at CFI.
Responsibilities:
- Interpret and analyze data problems.
- Conceive, prioritize, and plan data projects in alignment with organizational goals.
- Develop and implement databases, data collection systems, data analytics and other strategies that optimize statistical efficiency and quality.
- Acquire data from primary or secondary data sources and maintain databases/data systems.
- Identify, analyze, and interpret trends or patterns in complex data sets.
- Ensure data quality and integrity.
- Maintain the full data science and analysis documentation.
- Collaborate with team members and stakeholders to understand where data can bring the most value.
- Develop and implement data science infrastructure, such as experiment tracking, a feature store, and an A/B testing platform.
Requirements:
- BSc in Mathematics, Statistics, Data Science or related field.
- Strong analytical skills with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy.
- Good knowledge of database design and development, data models, data mining techniques, and segmentation.
- Build processes supporting data transformation, data structures, metadata, dependency and workload management.
- Hands-on experience with Python is a must.
- Hands-on experience with SQL queries.
- Knowledge of Databricks, Dataflow or Apache Spark would be a plus.
- Knowledge of Prefect/MLFlow or similar products, ML/DL libraries, and experience in working on the deployment of ML/DL solutions would be also a plus.
- Seniority level: Mid-Senior level
- Employment type: Full-time
- Job function: Information Technology
Benefits:
- Competitive salaries and benefits
- Work and learn with industry professionals
- Supportive and collaborative environment
- Unlimited opportunities for growth and development
Data Engineer
Job Description
We are seeking a skilled Data Engineer to design, implement, and support data integration solutions for our enterprise data warehouse. This role involves developing robust ETL pipelines, managing PL/SQL procedures, collaborating with business stakeholders, and ensuring the stability and performance of our data infrastructure. The ideal candidate will bring strong technical expertise in Informatica, PL/SQL, Python, and Linux/Unix environments, along with the ability to support mission-critical operations.
Key Responsibilities:
- ETL Pipeline Development: Design, build, and optimize ETL pipelines to load data from multiple source systems into the enterprise data warehouse.
- Stored Procedure Management: Create, optimize, and maintain PL/SQL stored procedures to support business needs and improve query performance.
- Informatica Development: Develop, troubleshoot, and optimize Informatica ETL workflows, including large-volume file processing and integrations.
- Ad-Hoc Data Requests: Support the Risk Department (Credit and Market Risk) by designing ETL solutions and delivering datasets as required.
- Collaboration: Work closely with Credit and Market Risk teams to gather requirements, implement changes, and streamline workflows.
- Scripting & Automation: Use Python for automation and file handling; develop and maintain Linux/Unix shell scripts for operational tasks.
- Server Administration: Perform routine server patching and maintenance for both Linux and Windows servers to ensure stability and security.
- Operational Support: Provide 24/7 production support for ETL pipelines, monitoring performance, and resolving issues promptly.
- Documentation: Create and maintain comprehensive documentation for ETL processes, workflows, and system configurations.
Required Skills & Qualifications:
- Strong hands-on experience with Informatica PowerCenter (or similar ETL tools).
- Proficiency in PL/SQL development and performance tuning.
- Solid understanding of data warehouse concepts and data modeling.
- Experience managing large data volumes with performance optimization.
- Proficiency in Python for scripting and data handling.
- Strong knowledge of Linux/Unix environments, including shell scripting and administration.
- Familiarity with Windows and Linux server patching/maintenance.
- Ability to provide 24/7 production support in a mission-critical environment.
- Excellent problem-solving, troubleshooting, and communication skills.
- Experience working with Risk Management (Credit/Market Risk) teams is a plus.
- Mid-Senior level
- Full-time
- Information Technology
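As a small illustration of the Python file-handling automation this role mentions, the sketch below archives processed data files into a dated directory. The paths and filename pattern are hypothetical, not part of the posting:

```python
# Minimal sketch of operational file-handling automation: move processed
# data files from an inbox into a dated archive directory.
import shutil
import tempfile
from datetime import date
from pathlib import Path

def archive_processed(inbox: Path, archive_root: Path, pattern: str = "*.csv"):
    """Move files matching pattern from inbox into archive_root/YYYY-MM-DD/."""
    target = archive_root / date.today().isoformat()
    target.mkdir(parents=True, exist_ok=True)
    moved = []
    for f in sorted(inbox.glob(pattern)):
        shutil.move(str(f), str(target / f.name))
        moved.append(f.name)
    return moved

# Demonstrate against a throwaway temporary directory.
base = Path(tempfile.mkdtemp())
inbox = base / "inbox"
inbox.mkdir()
(inbox / "trades.csv").write_text("id,amount\n1,10\n")
moved = archive_processed(inbox, base / "archive")
```

In production this kind of script usually runs from cron or a scheduler and logs each move for auditability.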
Data Engineer
Job Description
Bayut & dubizzle have the unique distinction of being iconic, homegrown brands with a strong presence across the seven emirates in the UAE. Connecting millions of users across the country, we are committed to delivering the best online search experience.
As part of Dubizzle Group, we are alongside some of the strongest classified brands in the market. With a collective strength of 8 brands, we have more than 160 million monthly users that trust in our dedication to providing them with the best platform for their needs.
The Data Engineer intern will participate in exciting projects covering the end-to-end data lifecycle – from raw data integrations with primary and third-party systems, through advanced data modelling, to state-of-the-art data visualisation and development of innovative data products.
You will have the opportunity to learn how to build and work with both batch and real-time data processing pipelines. You will work in a modern cloud-based data warehousing environment alongside a team of diverse, intense and interesting co-workers. You will liaise with other departments – such as product & tech, the core business verticals, trust & safety, finance and others – to enable them to be successful.
Key Responsibilities Include:
- Raw data integrations with primary and third-party systems
- Data warehouse modelling for operational and application data layers
- Development in Amazon Redshift cluster
- SQL development as part of agile team workflow
- ETL design and implementation in Matillion ETL
- Design and implementation of data products enabling data-driven features or business solutions
- Data quality, system stability and security
- Coding standards in SQL, Python, ETL design
- Building data dashboards and advanced visualisations in Periscope Data with a focus on UX, simplicity and usability
- Working with other departments on data products – e.g. product & technology, marketing & growth, finance, core business, advertising and others
- Being part and contributing towards a strong team culture and ambition to be on the cutting edge of big data
Minimum Requirements:
- Bachelor's degree in computer science, engineering, math, physics or any related quantitative field.
- Knowledge of relational and dimensional data models
- Knowledge of terminal operations and Linux workflows
- Ability to communicate insights and findings to a non-technical audience
- Good SQL skills across a variety of relational data warehousing technologies especially in cloud data warehousing (e.g. Amazon Redshift, Google BigQuery, Snowflake, Vertica, etc.)
- Attention to detail and analytical thinking
- Entrepreneurial spirit and ability to think creatively; highly-driven and self-motivated; strong curiosity and drive for continuous learning
- Ability to contribute to a platform used by more than 5M users in the UAE, as well as to other platforms in the region.
Bayut & dubizzle is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees.
Data Engineer
Job Description
Tyde Digital, a trusted business optimization consultancy, is looking for a highly analytical and detail-oriented Data Engineer to join our rapidly growing team.
At Tyde, we collaborate with businesses across diverse industries and regions, helping them enhance four key areas of their operations: people, processes, technology, and data. Our goal is to empower organizations with data-driven decision-making, enabling them to achieve their strategic objectives.
As a Data Engineer, you'll play a pivotal role in designing, managing, and optimizing data infrastructure. You'll work closely with Business Analysts and Data Analysts to transform complex data into actionable insights, driving meaningful and measurable outcomes for our clients.
Who We're Looking For
Tyde Digital is home to problem-solvers, innovators, and strategic thinkers. If you're passionate about business transformation, thrive in a collaborative setting, and are driven by results, you'll fit right in.
- Adaptability in a fast-moving environment
- Proactive problem-solving mindset
- Strong analytical and data-driven decision-making skills
- Meticulous attention to detail
- Commitment to delivering exceptional value to clients
- True passion and pride in the work we deliver
Key Responsibilities
- Data Pipeline Development: Design and implement scalable and efficient architecture for data pipelines, ensuring seamless data flow across multiple endpoints.
- Technical Customizations: Deploy technical customizations to pre-packaged software solutions.
- Micro-service and API Development: Create and deploy micro-services and APIs to facilitate data transfer and integration between various systems.
- Data Transformation: Perform data transformation activities, including data cleansing, data mining, and data processing, utilizing a variety of technologies and tools.
- Continuous Improvement: Collaborate with cross-functional teams to drive changes and enhancements based on data-driven findings.
- Data Mapping: Map and document data points across source systems and datasets.
- Solution Development: Design and implement end-to-end data-driven solutions tailored to business requirements, integrating tools, platforms, and custom logic to address specific operational challenges.
Involvement In
- Collecting Requirements from Clients: Identify and present key gaps and opportunities for business optimization from the client.
- Creating Data-Driven Insights: Provide valuable insights derived from data analysis and contribute to strategic decision-making processes.
- Reporting and Visualization: Develop reporting tools using a range of Business Intelligence (BI) tools such as Power BI, Tableau, QlikView, and others.
- KPI Identification: Identify and present key performance indicators (KPIs) based on existing data points and datasets.
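As a sketch of the KPI identification work described above, a simple month-over-month revenue growth KPI can be computed from raw data points like this; the field names and figures are illustrative assumptions:

```python
# Minimal sketch of deriving a KPI from raw data points: aggregate order
# amounts by month, then compute month-over-month revenue growth.
def monthly_revenue(orders):
    """orders: list of (month 'YYYY-MM', amount). Returns {month: total}."""
    totals = {}
    for month, amount in orders:
        totals[month] = totals.get(month, 0.0) + amount
    return totals

def growth_pct(totals, prev_month, month):
    """Month-over-month revenue growth as a percentage."""
    return (totals[month] - totals[prev_month]) / totals[prev_month] * 100

totals = monthly_revenue([
    ("2024-01", 1000.0), ("2024-01", 500.0), ("2024-02", 1800.0),
])
growth = growth_pct(totals, "2024-01", "2024-02")
```

In a BI tool such as Power BI or Tableau the same calculation would live in a measure or calculated field; the point is that a KPI is just a well-defined aggregation over agreed data points.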
Requirements
- Bachelor's degree in a relevant field (e.g., Computer Science, Data Science, Information Technology) or equivalent experience.
- 2+ years of proven experience in data engineering or a related role.
- Strong programming skills in languages such as Python, C#, or Java.
- Experience in extracting applicable data from databases using SQL.
- Experience in cloud computing platforms (AWS / GCP / Azure).
- Experience with BI tools like Power BI, Tableau, or similar platforms.
- Strong problem-solving abilities and attention to detail.
- Ability to work collaboratively in a cross-functional team environment.
- Knowledge of data governance and security best practices is a plus.
- Excellent verbal and written communication skills.
- Strong data visualization skills.
- Experience with Power Platform Development applications.
- Experience with custom SharePoint applications and workflows.
We are in a critical phase of growth, which brings incredible opportunities but also unique challenges. Our projects span multiple geographies, often requiring travel and periods of time abroad to ensure seamless delivery and deep client engagement. At the same time, our biggest project isn't just client-facing; it's also internal. We are constantly refining and improving our own systems and processes to ensure that as we scale, we remain agile and effective.
The pace is fast, so adaptability is key. However, the rewards of joining at this early stage are immense: your work will have a direct impact on the company's future, and you'll be in a position to shape the culture, operations, and strategic direction of a business that's on a rapid growth trajectory.
Why Join Us
- Client-First Philosophy: We prioritize the goals of our clients, offering a fresh alternative to traditional consultancy approaches. No legacy-minded advice or pushy sales tactics; just honest, results-driven solutions.
- Technology Agnostic Excellence: We're not here to sell software. We help businesses maximize their existing technology investments and identify the most suitable new tools when necessary.
- Data-Driven Innovation: We enable businesses to turn accurate, accessible, and actionable data into real-time insights, driving tangible business results.
- Sustainable Client Growth: Our solutions aren't just about solving today's challenges; they're about setting the foundation for tomorrow's innovation and success.
- Career Advancement: Joining the business at this phase of growth provides opportunity for accelerated career development, working closely with the founders.