What Jobs are available for Cloud Data Engineering in the United Arab Emirates?
Showing 73 Cloud Data Engineering jobs in the United Arab Emirates
Master Data Management Lead
Posted today
Job Description
Join us at Enquo, where we're dedicated to harnessing the transformative power of data and technology. As leaders in technology and data solutions, we prioritize humanity in everything we do. Our mission is clear: to empower organizations to unlock the full potential of their data through cutting-edge technology and exceptional services.
We envision a brighter future, where technology ignites extraordinary achievements and drives profound transformation. Here at Enquo, challenges are opportunities, and our passionate team thrives on making meaningful impacts on society. With humility and a collaborative spirit, we leverage teamwork and creative thinking to deliver optimal and trustworthy solutions.
As a purpose-driven company, we’re passionate about using data and technology as catalysts for positive change. Our vision extends to a world where everyone can harness the power of data to reach their fullest potential. At Enquo, honesty, trust, and empathy form the foundation of our simple business language. We’re agile, adaptable, and committed to bridging any business need with innovative data solutions.
Join our journey, where curiosity and entrepreneurship drive us to explore uncharted territories and create solutions that truly matter. We foster a collaborative and inclusive environment, valuing every team member's contributions. If you're a talented, curious, and creative individual who thrives in a fast-paced, dynamic setting, we invite you to be part of our mission. Together, let's create new opportunities through data and technology, shaping a more humane future for all.
Enquo: fueling a better future through innovation, data, and technology.
Role Description
The Master Data Management Lead will be responsible for defining, designing, and building dimensional databases to meet business needs, and for assisting in the application and implementation of data standards, guidelines, coding structures, and data replication to ensure access to and integrity of data sets.
Key Responsibilities
- Excellent experience in Master Data Management, including Meta-Data Management, Data Migration, Data Security, and Data Transformation/Conversion.
- Experience in ETL processes and advanced SQL skills.
- Intermediate Requirements Gathering/Elicitation, Documentation, and Source to Target mapping skills.
- Working knowledge of Conceptual, Logical, and Physical Data Modeling concepts, as well as database design concepts.
- Practical experience working in an Agile Methodology
- Bachelor’s or Master’s degree in Computer Science, Information Technology, or a related field.
- Proven experience in data quality management, with at least 8 years of experience in a leadership role.
- Strong understanding of data quality frameworks, tools, and methodologies.
- Proficiency in SQL and experience working with data profiling tools (see the sketch after this list).
- Excellent analytical and problem-solving skills.
- Leadership and team management abilities.
- Effective communication and collaboration skills.
- Familiarity with data governance principles is a plus.
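To make the data-profiling requirement concrete, here is a minimal sketch of column-level profiling and duplicate detection; it assumes pandas and a hypothetical customers.csv master-data extract, neither of which the posting specifies.

```python
# Minimal data-profiling sketch (pandas assumed; file and column names are hypothetical).
import pandas as pd

df = pd.read_csv("customers.csv")  # hypothetical master-data extract

# Per-column profile: type, completeness, and cardinality.
profile = pd.DataFrame({
    "dtype": df.dtypes.astype(str),
    "non_null": df.notna().sum(),
    "null_pct": (df.isna().mean() * 100).round(2),
    "distinct": df.nunique(),
})
print(profile)

# Flag candidate duplicate master records on a hypothetical natural key.
dupes = df[df.duplicated(subset=["customer_name", "date_of_birth"], keep=False)]
print(f"{len(dupes)} rows share a natural key and may need survivorship rules")
```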
Senior Data Management Consultant, Customer Experience
Posted today
Manager - TSG - Data Management for UAE Nationals
Posted 129 days ago
Software Back-end Engineer - Segments Cloud Computing LLC
Posted today
Job Description
Job Title: Backend Developer
Company: Segments Cloud Computing LLC
Experience Level: Minimum 2 years
Location: Dubai, AE
Salary Range: AED 3,000 - AED 3,500
Job Description: We are seeking a talented Backend Developer to join our team. As a Backend Developer, you will be responsible for building and maintaining the server-side logic, databases, and APIs. You will work closely with the frontend developers and other stakeholders to deliver high-performance, secure, and scalable backend solutions.
Key Responsibilities:
- Design, build, and maintain backend systems and APIs for web applications.
- Work with databases to ensure data integrity, scalability, and security.
- Collaborate with frontend developers to integrate user-facing elements with server-side logic.
- Optimize the application for maximum speed and scalability.
- Implement security and data protection measures.
- Debug and troubleshoot server-related issues.
- Write clean, reusable, and efficient code.
- Participate in code reviews and collaborate with the development team.
Qualifications:
- Bachelor's degree in Computer Science, Software Engineering, or a related field.
- Minimum of 2 years of experience as a Backend Developer.
- Proficiency in backend languages and frameworks such as Node.js, Python, Java, or Ruby on Rails.
- Experience with database technologies such as MySQL, PostgreSQL, MongoDB, or other NoSQL databases.
- Familiarity with RESTful and GraphQL APIs (see the sketch after this list).
- Experience with cloud services like AWS, Google Cloud, or Azure.
- Knowledge of version control systems like Git.
- Strong understanding of security, performance optimization, and scalability principles.
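As a rough illustration of the API work referenced in the qualifications above, here is a minimal REST endpoint sketch; Flask and the in-memory store are assumptions for illustration only, since the posting does not name a framework.

```python
# Minimal REST API sketch (Flask assumed; routes and data model are illustrative).
from flask import Flask, jsonify, request

app = Flask(__name__)
items = {}   # in-memory stand-in for a real database
next_id = 1

@app.post("/items")
def create_item():
    global next_id
    payload = request.get_json(force=True)
    items[next_id] = {"id": next_id, "name": payload.get("name", "")}
    next_id += 1
    return jsonify(items[next_id - 1]), 201

@app.get("/items/<int:item_id>")
def get_item(item_id):
    item = items.get(item_id)
    return (jsonify(item), 200) if item else (jsonify({"error": "not found"}), 404)

if __name__ == "__main__":
    app.run(debug=True)
```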
Bonus Skills:
- Experience with microservices architecture.
- Familiarity with containerization tools like Docker or Kubernetes.
- Knowledge of DevOps practices.
- Experience with caching technologies like Redis or Memcached.
Benefits:
- Competitive salary.
- Health insurance and other benefits.
- Opportunity for professional development and career growth.
- A collaborative and innovative work environment.
How to Apply: If you are excited about this opportunity and meet the qualifications, please submit your resume to confidential or apply through Indeed.
Head of Data Engineering
Posted today
Job Description
As the Head of Data Engineering, you will be responsible for designing, implementing, and maintaining a robust, scalable, and compliant data architecture that supports our exchange’s operations, analytics, and regulatory reporting requirements. You will lead a team of data engineers, ensuring high availability, security, and performance of our data infrastructure. You will work closely with stakeholders across technology, compliance, risk, and product teams to develop data pipelines, warehouses, and real-time analytics capabilities.
This is a strategic yet hands-on role where you will drive data engineering best practices, scalability, and automation while ensuring compliance with regulatory data requirements for a financial services entity.
Key Responsibilities
Data Architecture & Strategy:
- Define and own the data architecture strategy for the exchange, ensuring it is scalable, secure, and regulatory-compliant.
- Design and implement data modeling, governance, and security frameworks to meet financial and regulatory requirements.
- Architect real-time and batch processing data pipelines to handle trade data, order books, user activity, and market analytics (see the sketch after this list).
- Optimize data storage and retrieval for performance and cost efficiency, leveraging cloud-based and hybrid solutions.
- Build and maintain ETL / ELT data pipelines for operational, analytical, and compliance-related data.
- Ensure high data quality, reliability, and availability through robust monitoring, alerting, and data validation techniques.
- Manage and enhance data warehouses, data lakes, and streaming platforms to support business intelligence and machine learning use cases.
- Oversee database design and optimization for transactional and analytical workloads (e.g., Aurora, Redis, Kafka).
- Implement data lineage, metadata management, and access control mechanisms in line with compliance requirements.
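For a flavour of the real-time pipeline work described in this list, here is a minimal streaming-consumer sketch; it assumes the kafka-python client and a hypothetical trades topic, since the posting names Kafka only as an example technology.

```python
# Minimal streaming-consumer sketch (kafka-python assumed; topic and fields hypothetical).
import json
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "trades",                            # hypothetical trade-event topic
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    auto_offset_reset="earliest",
    group_id="trade-analytics",
)

for message in consumer:
    trade = message.value
    # Route unusually large trades to a downstream alerting/validation step.
    if trade.get("notional", 0) > 1_000_000:
        print(f"large trade flagged: {trade.get('trade_id')}")
```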
Compliance, Security & Risk Management:
- Work closely with compliance and risk teams to ensure data retention policies, audit trails, and reporting mechanisms meet regulatory requirements (e.g., FATF, AML, GDPR, MiCA).
- Implement encryption, anonymization, and access control policies to safeguard sensitive user and transaction data.
- Support fraud detection and risk monitoring through data analytics and alerting frameworks.
Leadership & Team Management:
- Lead, mentor, and grow a small team of data engineers, fostering a culture of collaboration, innovation, and accountability.
- Drive best practices in data engineering, DevOps for data, and CI / CD automation for analytics infrastructure.
- Collaborate with software engineers, DevOps, data analysts, and product teams to integrate data solutions into the broader exchange ecosystem.
Technical Competencies and Skills:
- Proven experience in data architecture and engineering, preferably within a regulated financial or crypto environment.
- Strong proficiency in SQL, Python, or Scala for data engineering.
- Experience with cloud-based data platforms (AWS, GCP, or Azure) and orchestration tools (Airflow, Prefect, Dagster).
- Hands-on experience with real-time data processing (Kafka, Pulsar, Flink, Spark Streaming).
- Expertise in data warehousing solutions (Snowflake, BigQuery, Redshift, Databricks).
- Strong understanding of database design, indexing strategies, and query optimization.
- Experience implementing data governance, lineage, and cataloging tools.
- Familiarity with blockchain / crypto data structures and APIs is a plus.
Leadership & Strategic Skills:
- Experience leading and mentoring a team of data engineers.
- Ability to design data strategies that align with business goals and regulatory requirements.
- Strong cross-functional collaboration skills with compliance, risk, and technology teams.
- Ability to work in a fast-paced, high-growth startup environment with a hands-on approach.
Industry & Compliance Knowledge:
- Experience in regulated financial markets, fintech, or crypto is highly preferred.
- Familiarity with financial data standards, KYC / AML reporting, and regulatory requirements related to data handling.
Preferred Qualifications:
- Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related field.
- Certifications in cloud data engineering (AWS / GCP / Azure), data governance, or security are a plus.
- Experience working in a crypto exchange, trading platform, or high-frequency trading environment is an advantage.
At M2, we believe in a workplace where talent, dedication, and passion are the only factors that count, regardless of gender, background, age, and other characteristics. We embrace diversity because we know that it fuels innovation, fosters creativity, and drives success. So, if you're ready to join a team where your potential is truly valued, welcome aboard!
OCI Cloud Engineer
Posted today
Job Description
We are seeking a highly skilled and experienced OCI Cloud Engineer to join our team in Dubai, UAE. In this role, you will be responsible for designing, implementing, and managing cloud-based solutions using Oracle Cloud Infrastructure (OCI). The ideal candidate will have a strong background in cloud technologies and a passion for leveraging cloud services to drive innovation and efficiency.
Key Responsibilities of the OCI Cloud Engineer
- 3+ years of experience working with Oracle Cloud Infrastructure (OCI), including core services like Compute, Storage, Networking, and IAM.
- Strong understanding of OCI networking components, including VCNs, Security Lists, NSGs, Load Balancers, and Gateways.
- Advanced knowledge of Terraform, including writing complex reusable modules, managing state files, and troubleshooting IaC deployments.
- Experience integrating Terraform with CI/CD pipelines for automated deployments.
- Hands-on experience creating and managing Ansible playbooks, roles, and inventories.
- Proficiency in using Ansible for configuration management, patching, and software deployment in cloud environments.
- Deep understanding of TCP/IP, DNS, BGP, VPNs, and hybrid connectivity solutions such as FastConnect.
- Knowledge of cloud security best practices, including IAM policies, encryption, and vulnerability management.
- Experience in scripting with Python, Bash, or PowerShell for custom automation tasks (see the sketch after this list).
- Bachelor's Degree in Computer Science, Information Technology, or a related field.
- Minimum of 3 years of experience working with cloud technologies, preferably OCI.
- Strong understanding of cloud architecture, security, and best practices.
- Proficiency in scripting and automation tools.
- Excellent problem-solving and analytical skills.
- Strong communication and interpersonal skills.
- Ability to work effectively in a fast-paced and dynamic environment.
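As a small example of the scripting work mentioned above, here is a hedged sketch that lists compute instances with the standard OCI Python SDK; the compartment OCID and config profile are placeholders, not details from the posting.

```python
# Sketch: list OCI compute instances (oci SDK; compartment OCID is a placeholder).
import oci

# Reads credentials from the default ~/.oci/config profile.
config = oci.config.from_file()
compute = oci.core.ComputeClient(config)

compartment_id = "ocid1.compartment.oc1..example"  # hypothetical OCID
instances = compute.list_instances(compartment_id=compartment_id).data

for inst in instances:
    print(f"{inst.display_name}: {inst.lifecycle_state}")
```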
Data Engineer
Posted today
Job Description
Property Monitor is the UAE’s leading real estate technology and market intelligence platform, recently acquired by Dubizzle Group. At Property Monitor, we empower developers, brokers, investors, and property professionals with authoritative data and powerful analytics, enabling them to make faster, smarter, and more informed decisions.
As part of Dubizzle Group, we stand alongside five powerhouse brands, including market-leading platforms like Bayut and dubizzle, trusted by over 123 million monthly users. Together, these brands shape how people buy, sell, and connect across real estate, classifieds, and services in the UAE and the broader region.
The Data Engineer will help deliver world-class big data solutions and drive impact for the dubizzle business. You will be responsible for exciting projects covering the end-to-end data life cycle – from raw data integrations with primary and third-party systems, through advanced data modeling, to state-of-the-art data visualization and development of innovative data products.
You will have the opportunity to build and work with both batch and real-time data processing pipelines. While working in a modern cloud-based data warehousing environment alongside a team of diverse, intense, and interesting co-workers, you will liaise with other teams, such as product & tech, the core business verticals, trust & safety, finance, and others, to enable them to be successful.
In this role, you will work on:
- Raw data integrations with primary and third-party systems
- Data warehouse modelling for operational and application data layers
- Development in Amazon Redshift cluster
- SQL development as part of agile team workflow
- ETL design and implementation in Matillion ETL
- Real-time data pipelines and applications using serverless and managed AWS services such as Lambda, Kinesis, API Gateway, etc. (see the sketch after this list)
- Design and implementation of data products enabling data-driven features or business solutions
- Data quality, system stability and security
- Coding standards in SQL, Python, ETL design
- Building data dashboards and advanced visualisations in Periscope Data with a focus on UX, simplicity and usability
- Working with other departments on data products – i.e. product & technology, marketing & growth, finance, core business, advertising and others
- Being part of and contributing to a strong team culture and the ambition to be on the cutting edge of big data
- Working autonomously, without supervision, on complex projects
- Participating in the early morning ETL status check rota
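To illustrate the serverless real-time pattern referenced in this list, here is a minimal sketch of an AWS Lambda handler consuming a Kinesis stream; the event shape follows the standard Kinesis-to-Lambda integration, while the record fields themselves are hypothetical.

```python
# Sketch: AWS Lambda handler for Kinesis events (record fields are hypothetical).
import base64
import json

def handler(event, context):
    processed = 0
    for record in event["Records"]:
        # Kinesis delivers each payload base64-encoded.
        payload = base64.b64decode(record["kinesis"]["data"])
        listing_event = json.loads(payload)
        # Illustrative filter before loading downstream (e.g. into Redshift).
        if listing_event.get("event_type") == "listing_viewed":
            processed += 1
    return {"processed": processed}
```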
Requirements:
- Top-of-class technical degree in a field such as computer science, engineering, math, or physics.
- 3+ years of experience working with customer-centric data at big data-scale, preferably in an online / e-commerce context
- 2+ years of experience with one or more programming languages, especially Python
- Strong track record in business intelligence solutions, building and scaling data warehouses and data modelling
- Experience with modern big data ETL tools is a plus (e.g. Matillion)
- Experience with AWS data ecosystem (or other cloud providers)
- Experience with modern data visualization platforms such as Sisense (formerly Periscope Data), Google Data Studio, Tableau, MS Power BI etc.
- Knowledge of modern real-time data pipelines is a strong plus (e.g. Serverless Framework, Lambda, Kinesis, etc.)
- Knowledge of relational and dimensional data models
- Knowledge of terminal operations and Linux workflows
- World-class SQL skills across a variety of relational data warehousing technologies especially in cloud data warehousing (e.g. Amazon Redshift, Google BigQuery, Snowflake, Vertica, etc.)
- Ability to communicate insights and findings to a non-technical audience
What We Offer:
- A fast-paced, high-performing team.
- Multicultural environment with over 50 different nationalities
- Competitive Tax-free Salary
- Comprehensive Health Insurance
- Annual Air Ticket Allowance
- Employee discounts at multiple vendors across the Emirates
- Rewards & Recognitions
- Learning & Development
Dubizzle Group is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees.
Data Engineer
Posted today
Job Description
Education / Qualifications / Professional Training
Bachelor’s degree in Computer Science or Management, with 4+ years of experience in Vertica and equivalent databases.
- Vertica Certification is a plus.
- Experience with data visualization tools (e.g., Power BI, SAP BI, SAS, Tableau) for data reporting and dashboard creation is beneficial.
More than 4 years of experience is required in Vertica database functionalities.
Technical Competencies
- Bachelor’s degree in Computer Science, Information Technology, or a related field.
- Proven experience as a Data Engineer or similar role with hands-on expertise in designing and managing data solutions in Vertica.
- Strong proficiency in SQL and experience with data modeling and schema design in Vertica (see the sketch after this list).
- In-depth knowledge of ETL processes and tools, particularly for data integration into Vertica.
- Familiarity with other big data technologies (e.g., Hadoop, Spark) and cloud platforms (e.g., AWS, Azure) is advantageous.
- Understanding of data warehousing concepts and best practices.
- Experience in performance tuning and optimization of Vertica databases.
- Familiarity with Linux environments and shell scripting for data-related automation tasks is a plus.
- Excellent problem-solving skills and the ability to handle large datasets effectively.
- Strong communication and collaboration skills to work effectively within a team-oriented environment.
- Self-motivated, with the ability to work independently and manage multiple tasks and projects simultaneously.
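As a concrete illustration of the SQL-in-Vertica work listed above, here is a minimal connection-and-query sketch; it assumes the vertica-python client, and the host, credentials, and table are placeholders rather than details from the posting.

```python
# Sketch: query Vertica from Python (vertica-python assumed; connection details are placeholders).
import vertica_python

conn_info = {
    "host": "vertica.example.internal",  # hypothetical host
    "port": 5433,
    "user": "etl_user",
    "password": "change-me",
    "database": "analytics",
}

with vertica_python.connect(**conn_info) as connection:
    cursor = connection.cursor()
    # Illustrative check on a hypothetical fact table's daily load volumes.
    cursor.execute(
        "SELECT load_date, COUNT(*) FROM fact_sales "
        "GROUP BY load_date ORDER BY load_date DESC LIMIT 7"
    )
    for load_date, row_count in cursor.fetchall():
        print(load_date, row_count)
```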
Data Engineer
Posted today
Job Description
About the Role
We are an emerging, AI-native, product-driven, agile start-up under the Abu Dhabi government, and we are seeking a motivated and technically versatile Data Engineer to join our team. You will play a key role in delivering data platforms, pipelines, and ML enablement within a Databricks on Azure environment.
As part of a stream-aligned delivery team, you’ll work closely with Data Scientists, Architects, and Product Managers to build scalable, high-quality data solutions for clients. You'll be empowered by a collaborative environment that values continuous learning, Agile best practices, and technical excellence.
Ideal candidates have strong hands-on experience in Databricks, Python, ADF and are comfortable in fast-paced, client-facing consulting engagements.
Skills and Experience requirements
1. Technical
- Databricks (or similar), e.g. Notebooks (Python, SQL), Delta Lake, job scheduling, clusters and workspace management, Unity Catalog, and access-control awareness (see the sketch after this list)
- Cloud data engineering – ideally Azure, including storage (e.g., ADLS, S3), compute, and secrets management
- Development languages such as Python, SQL, C#, JavaScript, etc., especially for data ingestion, cleaning, and transformation
- ETL / ELT – including structured logging, error handling, reprocessing strategies, APIs, flat files, databases, message queues, event streaming, event sourcing, etc.
- Automated testing (ideally TDD), pairing/mobbing, Trunk-Based Development, Continuous Deployment, and Infrastructure-as-Code (Terraform)
- Git and CI/CD for notebooks, data pipelines, and deployments
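For a sense of the Databricks work in the list above, here is a minimal Delta Lake sketch as it might appear in a notebook, where a SparkSession named spark is provided by the platform; the ADLS path and table name are hypothetical.

```python
# Sketch: clean and append events into a Delta table from a Databricks notebook.
# Assumes the platform-provided SparkSession `spark`; path and table are hypothetical.
from pyspark.sql import functions as F

raw = spark.read.json("abfss://landing@example.dfs.core.windows.net/events/")

cleaned = (
    raw.dropDuplicates(["event_id"])                # deduplication
       .withColumn("ingested_at", F.current_timestamp())
       .filter(F.col("event_type").isNotNull())     # basic validation
)

# Append into a managed Delta table (Unity Catalog three-level name, illustrative).
cleaned.write.format("delta").mode("append").saveAsTable("main.bronze.events")
```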
2. Integration & Data Handling
- Experienced in delivering platforms for clients, including file transfer, APIs (REST, etc.), SQL/NoSQL/graph databases, JSON, CSV, XML, Parquet, etc.
- Data validation and profiling: assessing incoming data quality and coping with schema drift, deduplication, and reconciliation
- Testing and monitoring pipelines: Unit tests for transformations, data checks, and pipeline observability
3. Working Style
- Comfortable leveraging the best of lean, agile and waterfall approaches. Can contribute to planning, estimation, and documentation, but also collaborative daily re-prioritisation
- Able to explain technical decisions to teammates or clients
- Documents decisions and keeps stakeholders informed
- Comfortable seeking support from other teams for Product, Databricks, Data architecture
- Happy to collaborate with Data Science team on complex subsystems
Nice-to-haves
- MLflow or light MLOps experience (for the data science touchpoints)
- dbt / Dagster / Airflow or similar transformation and orchestration tools
- Understanding of security and compliance (esp. around client data)
- Past experience in consulting or client-facing roles
Candidate Requirements
- 5–8 years of experience (minimum 3–4 years hands-on with cloud/data engineering, 1–2 years with Databricks/Azure, and team/project leadership exposure)
- Bachelor’s degree in Computer Science, Data Engineering, Software Engineering, or Information Systems
Job Type: Full-time
Benefits
Visa, insurance, yearly flight ticket, bonus scheme, and relocation logistics covered.
The interviewing process consists of 2 or 3 technical/behavioral interviews.