81 Data Driven Products jobs in Dubai
Data Engineering Specialist
Posted today
Job Description
**Job Title:** Data Engineering Specialist
The Role
As a Data Engineering Specialist, you will play a pivotal role in transforming complex data landscapes into actionable business insights. Your expertise will enable you to rapidly convert diverse new data sources into datasets that support our overarching business intelligence and analytics goals.
Main Responsibilities:
- Evaluate new data sources from SAP and non-SAP systems
- Rapidly convert these data sources into usable datasets
- Develop and support the integration of new data streams into our Azure Data platform (a minimal landing sketch follows this list)
- Ensure data is accessible, reliable, and structured for business insight generation
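As a rough illustration of this kind of work, the sketch below normalizes a hypothetical CSV extract and lands it as Parquet in Azure Data Lake Storage Gen2. The container, paths, column handling, and environment variable are assumptions for illustration only, not details taken from this posting.

```python
# Illustrative only: normalize a hypothetical source extract and land it in
# Azure Data Lake Storage Gen2. Names (container, paths, env var) are assumed.
import io
import os

import pandas as pd
from azure.storage.filedatalake import DataLakeServiceClient


def land_extract(csv_path: str, container: str = "raw",
                 target_path: str = "sap/orders.parquet") -> None:
    # Read and lightly normalize the source extract.
    df = pd.read_csv(csv_path)
    df.columns = [c.strip().lower().replace(" ", "_") for c in df.columns]

    # Serialize to Parquet in memory (requires pyarrow).
    buf = io.BytesIO()
    df.to_parquet(buf, index=False)
    buf.seek(0)

    # Upload to ADLS Gen2; the connection string is read from the environment.
    service = DataLakeServiceClient.from_connection_string(
        os.environ["ADLS_CONNECTION_STRING"]
    )
    file_client = service.get_file_system_client(file_system=container).get_file_client(target_path)
    file_client.upload_data(buf.read(), overwrite=True)


if __name__ == "__main__":
    land_extract("orders_extract.csv")
```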
Required Skills and Qualifications:
- Expertise in data engineering and technical integration
- Ability to evaluate diverse data sources and convert them into usable datasets
- Familiarity with Azure Data platform and its features
- Excellent analytical and problem-solving skills
Benefits:
You will directly impact our ability to make data-driven decisions and maintain our competitive edge in the marketplace. Your contributions will be invaluable in driving business growth and success.
Others:
We offer a dynamic work environment and opportunities for professional growth and development.
Data Engineering Specialist
Posted today
Job Description
Data Engineering Specialist
Job Overview: We are seeking a seasoned data engineering professional with a strong background in designing and implementing end-to-end data pipelines on cloud platforms.
Key Responsibilities:
- Design, develop, and optimize Extract, Transform, Load (ETL) workflows for effective migration and transformation.
- Maintain data ingestion processes from diverse sources such as Salesforce, Oracle databases, PostgreSQL, and MySQL.
- Implement and maintain ETL processes and data pipelines to ensure efficient data extraction, transformation, and loading.
- Leverage Snowflake as the data warehouse solution for managing large volumes of structured and unstructured data (a minimal loading sketch follows this list).
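As a hedged sketch of the ingestion pattern described above, the snippet below pulls a table from PostgreSQL and loads it into Snowflake. The connection parameters, table, warehouse, and schema names are hypothetical, not taken from the posting; it assumes psycopg2, pandas, and snowflake-connector-python are available.

```python
# Illustrative only: extract a PostgreSQL table and load it into Snowflake.
import os

import pandas as pd
import psycopg2
import snowflake.connector
from snowflake.connector.pandas_tools import write_pandas


def copy_customers() -> None:
    # Extract: read the source table into a DataFrame.
    with psycopg2.connect(os.environ["PG_DSN"]) as pg_conn:
        df = pd.read_sql("SELECT * FROM public.customers", pg_conn)

    # Load: write the DataFrame into a Snowflake staging table.
    sf_conn = snowflake.connector.connect(
        account=os.environ["SF_ACCOUNT"],
        user=os.environ["SF_USER"],
        password=os.environ["SF_PASSWORD"],
        warehouse="LOAD_WH",
        database="ANALYTICS",
        schema="STAGING",
    )
    try:
        write_pandas(sf_conn, df, table_name="CUSTOMERS", auto_create_table=True)
    finally:
        sf_conn.close()


if __name__ == "__main__":
    copy_customers()
```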
This is an opportunity to work on complex data engineering projects, collaborating with cross-functional teams to deliver high-quality solutions.
Data Engineering Lead
Posted today
Job Description
Shape the Future of Data Engineering
We're seeking a seasoned data engineering expert to lead our efforts in building a cutting-edge data foundation. As a key member of our team, you'll have the opportunity to design and implement scalable data systems that support real-time and batch analytics and operations.
- Data Architecture Expertise: You'll be responsible for architecting data systems that ensure data quality, speed, and scalability, using GCP, Airflow, BigQuery, dbt, and Dataflow.
- Pipeline Development: You'll build end-to-end pipelines to deliver high-quality data products, working closely with cross-functional teams to ensure seamless collaboration (a minimal orchestration sketch follows this list).
- Data Governance: You'll establish robust data governance and quality rules to promote transparency, trust, and compliance throughout the organization.
- Leadership and Collaboration: As a leader, you'll work with teams to drive impactful data products and initiatives, fostering a culture of innovation and excellence.
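To make the stack above concrete, here is a minimal sketch of an Airflow DAG that stages a BigQuery table and then runs dbt. The DAG id, dataset names, query, and dbt project path are assumptions for illustration; it presumes apache-airflow with the Google provider and a configured GCP connection.

```python
# Illustrative only: a minimal Airflow DAG chaining a BigQuery load and a dbt run.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="daily_orders_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Materialize a staging table in BigQuery with a scheduled query.
    load_orders = BigQueryInsertJobOperator(
        task_id="load_orders",
        configuration={
            "query": {
                "query": "CREATE OR REPLACE TABLE staging.orders AS "
                         "SELECT * FROM raw.orders WHERE _loaded_at >= CURRENT_DATE()",
                "useLegacySql": False,
            }
        },
    )

    # Run dbt transformations on top of the staged data.
    run_dbt = BashOperator(
        task_id="run_dbt",
        bash_command="cd /opt/dbt/analytics && dbt run --select orders",
    )

    load_orders >> run_dbt
```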
This is an exciting opportunity to join a fast-scaling company and make a meaningful impact on its growth and success. If you're passionate about data engineering and want to take your skills to the next level, we encourage you to apply.
Requirements:
- GCP Expertise: Proven experience with Google Cloud Platform (BigQuery, Dataflow), along with Airflow, dbt, and Python.
- Advanced SQL Skills: Strong expertise in advanced SQL querying and performance tuning.
- Collaboration and Leadership: Ability to work effectively with cross-functional teams and lead by example.
What We Offer:
- Autonomy and Ownership: You'll have the freedom to drive strategy and make decisions that impact the company's growth and success.
- Continuous Learning: We prioritize professional development and offer opportunities for growth and learning.
About Us:
We're a dynamic and forward-thinking organization dedicated to delivering innovative solutions and driving business growth. Our mission is to empower our customers with cutting-edge technology and expertise.
Data Engineering Specialist
Posted today
Job Description
We are seeking a skilled Data Engineer to join our team in the UAE.
Key Responsibilities:
- Design and develop scalable data models using techniques such as star schema and snowflake schema (a minimal schema sketch follows this list).
- Develop and maintain database stored procedures and query language scripts using PL/SQL, Java, and SQL.
- Utilize ETL tools like Informatica and Microsoft SSIS to extract, transform, and load data.
- Work with business intelligence tools such as SAP Business Objects and Microsoft Power BI to create interactive dashboards.
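A star schema pairs a central fact table with surrounding dimension tables. As a purely hypothetical illustration of the modeling work listed above (not a schema from this posting), the snippet below holds Oracle-style DDL for one fact and two dimensions; the statements could be executed through any SQL client or PL/SQL routine.

```python
# Illustrative only: a minimal star-schema layout for a banking reporting mart,
# expressed as DDL strings. All table and column names are hypothetical.
DDL_STATEMENTS = [
    # Dimension: calendar attributes shared by all facts.
    """
    CREATE TABLE dim_date (
        date_key     NUMBER PRIMARY KEY,
        calendar_day DATE   NOT NULL,
        month_name   VARCHAR2(20),
        fiscal_year  NUMBER
    )
    """,
    # Dimension: account attributes.
    """
    CREATE TABLE dim_account (
        account_key  NUMBER PRIMARY KEY,
        account_no   VARCHAR2(34),
        product_type VARCHAR2(50),
        branch_code  VARCHAR2(10)
    )
    """,
    # Fact: one row per posted transaction, keyed to the dimensions above.
    """
    CREATE TABLE fact_transactions (
        transaction_id NUMBER       PRIMARY KEY,
        date_key       NUMBER       NOT NULL REFERENCES dim_date (date_key),
        account_key    NUMBER       NOT NULL REFERENCES dim_account (account_key),
        amount_aed     NUMBER(18,2) NOT NULL
    )
    """,
]
```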
Requirements:
- Hands-on experience with core banking systems, preferably Finacle HPS or Power Card.
- Strong analytical and problem-solving skills.
- Excellent written and verbal communication skills in English.
- Ability to collaborate effectively within a team environment.
- Understanding of the UAE Banking Regulatory Framework Submissions (BRF).
- Familiarity with the Central Bank of the UAE's Supervisory Technology initiative and its objectives.
- Experience in automating regulatory reporting processes.
The selected candidate will be expected to work full-time on site. The joining time frame is within two weeks.
Terms and Conditions:
The employee must adhere to all company policies and procedures.
Remote Work:
No
Employment Type:
Full-time
Data Engineering Expert
Posted today
Job Description
Description:
We are seeking an experienced Data Engineering Expert to lead the development of our big data architecture. The successful candidate will design, implement, and maintain scalable and secure data pipelines using Spark, Hive, and Python on Cloudera.
Key Responsibilities:
- Develop real-time data workflows using Kafka (see the streaming sketch after this list).
- Design and develop APIs for data access and integration.
- Utilize Hue, Oozie, and other Cloudera tools for job orchestration and data access.
- Deploy solutions on cloud platforms such as AWS, Azure, or GCP.
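As a rough sketch of the real-time workflow mentioned above, the snippet below uses PySpark Structured Streaming to read JSON events from Kafka and append them to Parquet. The broker address, topic, schema, and paths are assumptions; an actual Cloudera deployment would differ in configuration.

```python
# Illustrative only: PySpark Structured Streaming job reading JSON events from Kafka.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import DoubleType, StringType, StructField, StructType

spark = SparkSession.builder.appName("kafka_events_to_parquet").getOrCreate()

event_schema = StructType([
    StructField("event_id", StringType()),
    StructField("user_id", StringType()),
    StructField("amount", DoubleType()),
])

# Subscribe to a hypothetical "events" topic.
raw = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker-1:9092")
    .option("subscribe", "events")
    .load()
)

# Kafka delivers bytes; cast the value to a string and parse the JSON payload.
events = (
    raw.selectExpr("CAST(value AS STRING) AS json")
    .select(from_json(col("json"), event_schema).alias("e"))
    .select("e.*")
)

# Continuously append parsed events to a Parquet sink with checkpointing.
query = (
    events.writeStream.format("parquet")
    .option("path", "/data/events_parquet")
    .option("checkpointLocation", "/data/checkpoints/events")
    .start()
)
query.awaitTermination()
```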
Requirements:
- Over 7 years of experience in Big Data engineering.
- Hands-on experience with Cloudera Spark, Hive, Kafka, Python, Hue, and Ranger.
- Strong understanding of distributed systems and cloud data services.
- Proficient in API development and data security controls.
About this Opportunity:
This is a unique opportunity to join a rapidly growing IT services company with a presence in multiple countries, working with top clients in various sectors. Our team values teamwork, quality of life, and professional growth, offering a global community of professionals committed to your development.
What We Offer:
- A competitive salary and benefits package.
- Ongoing training and development opportunities.
- A dynamic and supportive work environment.
- The chance to work on exciting projects with cutting-edge technologies.
How to Apply:
If you are a motivated and experienced data engineer looking for a new challenge, please submit your application, including your resume and a cover letter outlining your qualifications and experience.
Head of Data Engineering
Posted today
Job Description
As the Head of Data Engineering, you will be responsible for designing, implementing, and maintaining a robust, scalable, and compliant data architecture that supports our exchange’s operations, analytics, and regulatory reporting requirements. You will lead a team of data engineers, ensuring high availability, security, and performance of our data infrastructure. You will work closely with stakeholders across technology, compliance, risk, and product teams to develop data pipelines, warehouses, and real-time analytics capabilities.
This is a strategic yet hands-on role where you will drive data engineering best practices, scalability, and automation while ensuring compliance with regulatory data requirements for a financial services entity.
Key Responsibilities
Data Architecture & Strategy:
- Define and own the data architecture strategy for the exchange, ensuring it is scalable, secure, and regulatory-compliant.
- Design and implement data modeling, governance, and security frameworks to meet financial and regulatory requirements.
- Architect real-time and batch processing data pipelines to handle trade data, order books, user activity, and market analytics.
- Optimize data storage and retrieval for performance and cost efficiency, leveraging cloud-based and hybrid solutions.
- Build and maintain ETL / ELT data pipelines for operational, analytical, and compliance-related data.
- Ensure high data quality, reliability, and availability through robust monitoring, alerting, and data validation techniques (a minimal validation sketch follows this list).
- Manage and enhance data warehouses, data lakes, and streaming platforms to support business intelligence and machine learning use cases.
- Oversee database design and optimization for transactional and analytical workloads (e.g., Aurora, Redis, Kafka).
- Implement data lineage, metadata management, and access control mechanisms in line with compliance requirements.
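As a simple illustration of the validation work mentioned above, the sketch below applies record-level checks to incoming trade events before they reach downstream consumers. The field names and rules are hypothetical, not requirements from this posting.

```python
# Illustrative only: record-level validation for hypothetical trade events.
from dataclasses import dataclass


@dataclass
class Trade:
    trade_id: str
    symbol: str
    price: float
    quantity: float


def validate(trade: Trade) -> list:
    """Return a list of human-readable issues; an empty list means the record passes."""
    issues = []
    if not trade.trade_id:
        issues.append("missing trade_id")
    if not trade.symbol:
        issues.append("missing symbol")
    if trade.price <= 0:
        issues.append(f"non-positive price: {trade.price}")
    if trade.quantity <= 0:
        issues.append(f"non-positive quantity: {trade.quantity}")
    return issues


if __name__ == "__main__":
    sample = Trade(trade_id="T-1001", symbol="BTC-USD", price=65000.0, quantity=0.0)
    problems = validate(sample)
    if problems:
        # In a real pipeline this would raise an alert or route the record to quarantine.
        print(f"{sample.trade_id}: " + "; ".join(problems))
```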
Compliance, Security & Risk Management:
- Work closely with compliance and risk teams to ensure data retention policies, audit trails, and reporting mechanisms meet regulatory requirements (e.g., FATF, AML, GDPR, MiCA).
- Implement encryption, anonymization, and access control policies to safeguard sensitive user and transaction data.
- Support fraud detection and risk monitoring through data analytics and alerting frameworks.
Leadership & Team Management:
- Lead, mentor, and grow a small team of data engineers, fostering a culture of collaboration, innovation, and accountability.
- Drive best practices in data engineering, DevOps for data, and CI / CD automation for analytics infrastructure.
- Collaborate with software engineers, DevOps, data analysis, and product teams to integrate data solutions into the broader exchange ecosystem.
Technical Competencies and Skills:
- Proven experience in data architecture and engineering, preferably within a regulated financial or crypto environment.
- Strong proficiency in SQL, Python, or Scala for data engineering.
- Experience with cloud-based data platforms (AWS, GCP, or Azure) and orchestration tools (Airflow, Prefect, Dagster).
- Hands-on experience with real-time data processing (Kafka, Pulsar, Flink, Spark Streaming).
- Expertise in data warehousing solutions (Snowflake, BigQuery, Redshift, Databricks).
- Strong understanding of database design, indexing strategies, and query optimization.
- Experience implementing data governance, lineage, and cataloging tools.
- Familiarity with blockchain / crypto data structures and APIs is a plus.
Leadership & Strategic Skills:
- Experience leading and mentoring a team of data engineers.
- Ability to design data strategies that align with business goals and regulatory requirements.
- Strong cross-functional collaboration skills with compliance, risk, and technology teams.
- Ability to work in a fast-paced, high-growth startup environment with a hands-on approach.
Industry & Compliance Knowledge:
- Experience in regulated financial markets, fintech, or crypto is highly preferred.
- Familiarity with financial data standards, KYC / AML reporting, and regulatory requirements related to data handling.
Preferred Qualifications:
- Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
- Certifications in cloud data engineering (AWS / GCP / Azure), data governance, or security are a plus.
- Experience working in a crypto exchange, trading platform, or high-frequency trading environment is an advantage.
At M2, we believe in a workplace where talent, dedication, and passion are the only factors that count, regardless of gender, background, age, and other characteristics. We embrace diversity because we know that it fuels innovation, fosters creativity, and drives success. So, if you're ready to join a team where your potential is truly valued, welcome aboard!
Chief Data Engineering Specialist
Posted today
Job Description
Job Opportunity:
We are seeking a seasoned data engineer with 5 years of experience in designing and maintaining data pipelines using Python. Expertise should include Python, SQL databases, Snowflake, Apache Airflow, and Shell scripting.
Main Responsibilities:
- Design, develop, and maintain robust data pipelines to support business operations and analytics using Python.
- Integrate APIs to fetch, process, and transform data for various use cases (see the ingestion sketch after this list).
- Work with multiple SQL databases to perform data transformations, migrations, and optimizations.
- Build and manage data warehouses using Snowflake, ensuring high performance and scalability.
- Implement and manage workflows to automate data pipeline orchestration.
- Develop and maintain Shell scripts to automate tasks and streamline processes.
- Ensure data quality, consistency, and security across all platforms.
- Collaborate with stakeholders to understand data requirements and deliver solutions.
- Monitor, troubleshoot, and optimize data pipelines for efficiency and reliability.
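As a small sketch of the API integration work listed above, the snippet below fetches records from a hypothetical REST endpoint, applies a light transformation, and stages the result as CSV for a downstream pipeline step. The endpoint, parameters, and field names are assumptions, not details from the posting.

```python
# Illustrative only: fetch, transform, and stage records from a hypothetical API.
import csv

import requests


def fetch_and_stage(base_url: str, out_path: str) -> int:
    resp = requests.get(f"{base_url}/v1/orders", params={"status": "completed"}, timeout=30)
    resp.raise_for_status()
    records = resp.json()

    # Transform: keep a few fields and normalize the amount to two decimals.
    rows = [
        {"order_id": r["id"], "customer": r["customer_id"], "amount": round(float(r["amount"]), 2)}
        for r in records
    ]

    # Stage as CSV; an orchestrator such as Airflow could pick this file up next.
    with open(out_path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["order_id", "customer", "amount"])
        writer.writeheader()
        writer.writerows(rows)
    return len(rows)


if __name__ == "__main__":
    print(fetch_and_stage("https://api.example.com", "orders_stage.csv"))
```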
Required Skills and Qualifications:
- Programming: Strong proficiency in Python with experience in API integration and data processing.
- Data Warehousing: Hands-on experience with Snowflake including schema design, data modeling, and performance optimization.
- Workflow Automation: Proficiency in Apache Airflow for managing and orchestrating workflows.
- Scripting: Strong skills in Shell scripting for automation and system management.
- ETL/ELT Processes: Solid understanding of designing and maintaining ETL/ELT pipelines.
- Problem Solving: Strong analytical and problem-solving skills to handle complex data challenges.
- Communication: Excellent communication skills to work effectively with cross-functional teams.
Data Engineering – Subject Matter Expert
Posted today
Job Description
Data Engineering – Subject Matter Expert (Dubai Hybrid)
About the role
We are seeking a seasoned Data Engineering SME with strong experience in data platforms, ETL tools, and cloud technologies. The ideal candidate will lead the design and implementation of enterprise-scale data solutions, provide strategic guidance on data architecture, and play a key role in data migration, data quality, and performance tuning initiatives. This role demands a mix of deep technical expertise, project management, and stakeholder communication.
Key Responsibilities
- Lead the design, development, and deployment of robust, scalable ETL pipelines and data solutions.
- Provide technical leadership and SME support for data engineering teams across multiple projects.
- Collaborate with cross-functional teams including Data Analysts, BI Developers, Product Owners, and IT to gather requirements and deliver data products.
- Design and optimize data workflows using tools such as IBM DataStage, Talend, Informatica, and Databricks.
- Implement data integration solutions for structured and unstructured data across on-premise and cloud platforms.
- Conduct performance tuning and optimization of ETL jobs and SQL queries.
- Oversee data quality checks, data governance compliance, and PII data protection strategies.
- Support and mentor team members on data engineering best practices and agile methodologies.
- Analyze and resolve production issues in a timely manner.
- Contribute to enterprise-wide data transformation strategies including legacy-to-digital migration using Spark, Hadoop, and cloud platforms.
- Manage stakeholder communications and provide regular status reports.
Required Skills and Qualifications
- Bachelor's degree in Engineering, Computer Science, or a related field (MTech in Data Science is a plus).
- 8+ years of hands-on experience in ETL development and data engineering.
- Strong proficiency with tools: IBM DataStage, Talend, Informatica, Databricks, Power BI, Tableau.
- Strong SQL, PL/I, Python, and Unix Shell scripting skills.
- Experience with cloud platforms like AWS and modern big data tools like Hadoop, Spark.
- Solid understanding of data warehousing, data modeling, and data migration practices.
- Experience working in Agile/Scrum environments.
- Excellent problem-solving, communication, and team collaboration skills.
- Scrum Master or Product Owner certifications (CSM, CSPO) are a plus.
It has come to our attention that clients and candidates are being contacted by individuals fraudulently posing as Antal representatives. If you receive a suspicious message (by email or WhatsApp), please do not click on any links or attachments. We never ask for credit card or bank details to purchase materials, and we do not charge fees to jobseekers.
Big Data Engineering Leadership Role
Posted today
Job Description
Big Data Engineering Leadership Role
As a senior big data engineer, you will be responsible for leading the development of complex distributed systems and big data technologies.
- Design, develop, and deploy scalable big data architectures using Hadoop and Spark.
- Lead a team of engineers to design, implement, and maintain big data solutions across multiple industries.
- Develop and maintain expertise in big data querying tools such as Pig, Hive, and Impala.
- Collaborate with cross-functional teams to design and implement big data analytics solutions.
Required Skills and Qualifications
- 10+ years of experience as a big data engineer.
- Expertise in Hadoop (Cloudera), Spark, and similar frameworks.
- Good knowledge of big data querying tools such as Pig, Hive, and Impala.
- Knowledge of programming and scripting languages such as Java, C++, Ruby, PHP, Python, and R, as well as Linux environments.
- Excellent leadership and communication skills.
- Ability to solve complex networking, data, and software issues.
- Ability to plan and organize work effectively.
- Strong interpersonal communication skills.
- Willingness to assist others in completing their tasks to support group goals.
Benefits
This role offers a challenging opportunity for a seasoned big data engineer to grow professionally and contribute to the success of our organization. You will have the chance to work on high-profile projects, collaborate with top talent, and receive competitive compensation and benefits.
Others
Our organization is committed to diversity, equity, and inclusion. We welcome applications from individuals who share our values and are passionate about making a difference in the field of big data engineering.