195 Data Architect jobs in the United Arab Emirates
Big Data Architect
Posted 1 day ago
Job Description
Job Title: Senior Data Engineer - Big Data/ Hadoop Ecosystem
Role Overview
As a seasoned Senior Data Engineer, you will lead cutting-edge data initiatives in the banking sector, leveraging expertise in the Hadoop ecosystem to architect, build, and optimize large-scale data systems. Your mentorship skills will be invaluable as you guide a team of data engineers.
This role is perfect for individuals who thrive in collaborative environments and are passionate about delivering robust data solutions that drive business growth.
Main Responsibilities
- Design, develop, and optimize scalable data processing systems using the Hadoop ecosystem (HDFS, MapReduce, Hive, Pig, HBase, Flume, Sqoop) and related Big Data technologies such as Java and Spark.
- Lead and mentor a team of data engineers to ensure timely and high-quality project delivery, fostering a culture of innovation and excellence.
- Engineer, tune, and maintain complex data pipelines in Java, MapReduce, Hive, and Spark, including implementing stream-processing with Spark-Streaming.
- Design and build efficient dimensional data models and scalable architectures to empower analytics and business intelligence, driving informed decision-making across the organization.
- Oversee data integrity analysis, deployment, validation, and auditing of data models for accuracy and operational excellence, ensuring the highest standards of data quality.
- Leverage advanced SQL skills for performance tuning and optimization of data jobs, maximizing system efficiency and throughput.
- Collaborate with business intelligence teams to deliver industry-leading dashboards and data products that meet the evolving needs of stakeholders.
Required Qualifications:
- 10+ years of hands-on experience as a Big Data Engineer, with deep technical expertise in the Hadoop ecosystem (Cloudera preferred), Apache Spark, and distributed data frameworks.
- Proven experience leading backend/distributed data systems teams while remaining technically hands-on, driving results-oriented projects that exceed expectations.
- Advanced proficiency in Java for MapReduce development, as well as strong skills in Python and/or Scala, ensuring adaptability and flexibility in a rapidly changing environment.
- Expertise in Big Data querying tools including Hive, Pig, and Impala, providing a competitive edge in data analysis and insights.
- Strong experience with both relational (Postgres) and NoSQL databases (Cassandra, HBase), enabling seamless data integration and management.
- Solid understanding of dimensional data modeling and data warehousing principles, driving data-driven decision-making and strategic business growth.
- Proficient in Linux/Unix systems and shell scripting, ensuring effective system administration and maintenance.
- Experience with Azure cloud services (Azure Data Lake, Databricks, HDInsight), facilitating cloud-based data management and scalability.
- Knowledge of stream-processing frameworks such as Spark-Streaming or Storm, enabling real-time data processing and analytics.
- Background in Financial Services or Banking industry, with exposure to data science and machine learning tools, enhancing data-driven business strategies.
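The stream-processing work this role describes (Spark-Streaming, Storm) ultimately reduces to windowed aggregation over micro-batches of events. As a framework-free sketch of the idea, assuming a tumbling-window count with hypothetical event shapes:

```python
from collections import defaultdict

def tumbling_window_counts(events, window_seconds):
    """Group (timestamp, key) events into fixed windows and count per key.

    Mirrors what a Spark-Streaming micro-batch job computes, minus the
    distribution and fault tolerance the framework provides.
    """
    counts = defaultdict(int)
    for ts, key in events:
        window_start = ts - (ts % window_seconds)  # align to window boundary
        counts[(window_start, key)] += 1
    return dict(counts)

events = [(0, "login"), (3, "login"), (7, "logout"), (12, "login")]
# 10-second windows: [0, 10) holds 2 logins and 1 logout, [10, 20) holds 1 login
print(tumbling_window_counts(events, 10))
```

In Spark Structured Streaming the same shape is expressed declaratively with `groupBy(window(...), col(...)).count()`; the sketch above only shows the per-batch logic.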
Senior Big Data Architect
Posted 1 day ago
Job Description
We are seeking a skilled professional to lead the development and maintenance of scalable data pipelines that ensure high data quality and availability across our organization. This role requires expertise in big data ecosystems, cloud-native tools, and advanced data processing techniques.
The ideal candidate will have hands-on experience with data ingestion, transformation, and optimization on distributed computing platforms, along with a proven track record of implementing data engineering best practices. You will work closely with other data engineers to build solutions that drive impactful business insights.
Key Responsibilities:
- Data Pipeline Development: Design, develop, and maintain highly scalable and optimized ETL pipelines using PySpark on the Cloudera Data Platform, ensuring data integrity and accuracy.
- Data Ingestion: Implement and manage data ingestion processes from various sources (e.g., relational databases, APIs, file systems) to the data lake or data warehouse on CDP.
- Data Transformation and Processing: Use PySpark to process, cleanse, and transform large datasets into meaningful formats that support analytical needs and business requirements.
- Performance Optimization: Conduct performance tuning of PySpark code and Cloudera components, optimizing resource utilization and reducing runtime of ETL processes.
- Data Quality and Validation: Implement data quality checks, monitoring, and validation routines to ensure data accuracy and reliability throughout the pipeline.
- Automation and Orchestration: Automate data workflows using tools like Apache Oozie, Airflow, or similar orchestration tools within the Cloudera ecosystem.
- Monitoring and Maintenance: Monitor pipeline performance, troubleshoot issues, and perform routine maintenance on the Cloudera Data Platform and associated data processes.
- Collaboration: Work closely with other data engineers, analysts, product managers, and other stakeholders to understand data requirements and support various data-driven initiatives.
- Documentation: Maintain thorough documentation of data engineering processes, code, and pipeline configurations.
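The pipeline responsibilities above follow a standard extract-transform-validate-load shape. A minimal framework-free sketch of that flow, with data-quality gates between transform and load; all record shapes and function names are hypothetical, and a real implementation would use PySpark DataFrames on CDP:

```python
def extract(rows):
    return list(rows)  # stand-in for reading from a source system

def transform(rows):
    # cleanse: strip whitespace, cast amounts to float
    return [{"id": r["id"], "name": r["name"].strip(),
             "amount": float(r["amount"])} for r in rows]

def validate(rows):
    # data-quality gates: non-null keys, unique ids, non-negative amounts
    ids = [r["id"] for r in rows]
    assert all(i is not None for i in ids), "null id"
    assert len(ids) == len(set(ids)), "duplicate id"
    assert all(r["amount"] >= 0 for r in rows), "negative amount"
    return rows

def load(rows, target):
    target.extend(rows)  # stand-in for writing to the lake or warehouse
    return target

source = [{"id": 1, "name": " Alice ", "amount": "10.5"},
          {"id": 2, "name": "Bob", "amount": "0"}]
warehouse = []
load(validate(transform(extract(source))), warehouse)
print(warehouse[0])  # {'id': 1, 'name': 'Alice', 'amount': 10.5}
```

Putting validation before load means a failed quality check halts the pipeline rather than polluting the target, which is the behavior most orchestration tools (Oozie, Airflow) expect from a task.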
Qualifications and Skills:
- Bachelor's or Master's degree in Computer Science, Data Engineering, Information Systems, or a related field.
- 3+ years of experience as a Data Engineer, with a strong focus on PySpark and the Cloudera Data Platform.
- Advanced proficiency in PySpark, including working with RDDs, DataFrames, and optimization techniques.
- Strong experience with Cloudera Data Platform (CDP) components, including Cloudera Manager, Hive, Impala, HDFS, and HBase.
- Knowledge of data warehousing concepts, ETL best practices, and experience with SQL-based tools (e.g., Hive, Impala).
- Familiarity with Hadoop, Kafka, and other distributed computing tools.
- Experience with Apache Oozie, Airflow, or similar orchestration frameworks.
- Strong scripting skills in Linux.
Soft Skills:
- Strong analytical and problem-solving skills.
- Excellent verbal and written communication abilities.
- Ability to work independently and collaboratively in a team environment.
- Attention to detail and commitment to data quality.
Data Architect
Posted today
Job Description
Our client, one of the leading IT consulting firms in Abu Dhabi, is seeking a Data Architect/Data Modeler for a 12-month remote contract. The Enterprise Data Architect is responsible for identifying data and analytics requirements across all use cases, applying logical and analytical thinking to design creative and innovative database solutions.
Key Responsibilities:
- Shape and guide the client's Data & Analytics strategy.
- Demonstrate strong expertise in data warehouse design, including dimensional modeling, data vault, star schema, and snowflake schema.
- Adapt and refine designs on the fly, providing clear justification for changes.
- Develop data models using tools such as Visio and Erwin.
- Create mapping documents and present them to ETL and testing teams to enable implementation.
- Collaborate with other architects and business stakeholders to gather, validate requirements, and secure approvals.
- Oversee multiple initiatives and projects from initiation to completion.
- Drive maximum value from data and analytics initiatives.
Qualifications:
- Bachelor's degree (preferably in Computer Science, Information Systems, or a related field).
- At least 8 years of IT experience, including at least 3 years in information systems design or data architecture.
- Proven expertise in designing and implementing information solutions, with deep knowledge of database structure principles (transactional modeling, dimensional modeling – star schema, data vault).
- Strong leadership and people management skills, with experience working across diverse cultures; capable of inspiring teams, driving agendas, coaching, and enhancing performance.
- Hands-on experience in implementing data and analytics management programs is a plus.
- Ability to assess project, program, and portfolio requirements, determine necessary resources, and navigate cross-functional challenges to achieve objectives.
Data Architect
Posted today
Job Description
Data Architect (Banking Domain) – 2 Months Contract, Dubai
Duration: 2 months
Experience: 8–12 years architecting data integrations, warehouses, or lakes for banks
Key Skills & Responsibilities:
- Deep expertise in Oracle ODI / PL/SQL or Informatica, with exposure to cloud data platforms (Azure, GCP, or OCI)
- Source-to-target mapping and data model design
- ETL/ELT architecture and CDC strategies
- Data quality and cleansing frameworks (DQS, Trillium, Collibra)
- Performance tuning and partitioning for large Finance fact tables
- Metadata and lineage documentation (Erwin / PowerDesigner)
- Proven track record extracting and modeling Flexcube data (LD, FT, AC, ST, etc.) via ODI/Informatica or native PL/SQL
- Knowledge of Flexcube APIs, staging schemas, and performance-tuning strategies for large AC/FT tables
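The CDC strategies this contract calls for amount to replaying an ordered stream of change records (insert/update/delete) into a keyed target table, which is what an ODI or Informatica merge step does. A minimal sketch of that apply step, with hypothetical record shapes:

```python
def apply_cdc(target, changes):
    """Apply ordered CDC records to a target table keyed by primary key.

    Each change is (op, key, row); later records win. 'D' deletes,
    while 'I' and 'U' both collapse into an upsert, as in a typical
    MERGE-based load.
    """
    for op, key, row in changes:
        if op == "D":
            target.pop(key, None)
        else:  # insert or update both become an upsert
            target[key] = row
    return target

table = {1: {"balance": 100}}
changes = [("U", 1, {"balance": 150}),
           ("I", 2, {"balance": 50}),
           ("D", 1, None)]
print(apply_cdc(table, changes))  # {2: {'balance': 50}}
```

Ordering matters: applying the delete before the update would leave account 1 resurrected, which is why CDC tools track a change sequence number per record.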
Data Architect
Posted today
Job Description
Overview:
Responsible for designing, implementing, and managing the organization's data architecture to ensure data availability, accuracy, security, and scalability.
Key Responsibilities:
- Design and optimize data models, databases, and pipelines.
- Define and enforce data governance, standards, and best practices.
- Collaborate with business and technical teams to translate requirements into scalable data solutions.
- Ensure data quality, integrity, and security across systems.
- Evaluate and implement data tools, platforms, and cloud solutions (AWS, Azure, GCP).
Skills & Experience:
- Strong knowledge of data modeling, ETL/ELT, and data warehousing.
- Proficiency in SQL, NoSQL, and big data technologies.
- Experience with cloud platforms and modern data architectures (e.g., lakehouse, streaming).
- Understanding of data governance, compliance, and security.
Data Architect
Posted today
Job Description
We seek an experienced Data Architect to lead the design, development, and maintenance of our enterprise Data Platform. You will collaborate with Data Engineers, BI Engineers, and Data Business Analysts across various platform layers, ensuring a scalable, secure, and high-performance data infrastructure. This role requires a deep understanding of cloud-based big data ecosystems (AWS & Azure), real-time data processing, and enterprise data governance.
- Lead the roadmap and enhancements for the enterprise Big Data Platform, ensuring scalability, reliability, and cost efficiency.
- Architect end-to-end data solutions that fulfill business intelligence and analytics needs while ensuring best practices in security, compliance, and operational efficiency.
- Lead and mentor a team of Data Engineers, BI Engineers, and Cloud Architects to implement high-quality, production-ready solutions.
- Design and implement modern data architectures, including data lakes, data warehouses, and real-time streaming solutions, leveraging both AWS and Azure technologies.
- Build and optimize scalable, near real-time data ingestion pipelines, ensuring low-latency, high-throughput data processing for analytics and operational reporting.
- Architect and manage data streaming solutions using technologies such as Kafka, Kinesis, Apache Flink, or Spark Streaming, ensuring fault tolerance, consistency, and efficiency in real-time workflows.
- Ensure data security, governance, and compliance across all platforms, integrating IAM, encryption, access control, and data lineage tracking.
- Collaborate with stakeholders to define and implement data integration strategies, including ETL, API-based data exchange, and event-driven architectures.
- Drive automation and DevOps best practices for data pipeline deployment and monitoring, using tools such as Terraform, CloudFormation, and CI/CD pipelines.
- Stay updated with the latest big data, cloud, and AI/ML advancements, and incorporate them into the organization's data strategy.
Requirements
- 10+ years of experience in enterprise data architecture, integration, and analytics, with a strong track record in big data solutions.
- Experience in leading and mentoring teams of Data Engineers and BI Developers.
- Bachelor's/Master's degree in Computer Science, Data Engineering, or a relevant field from an accredited institution.
- Strong hands-on experience with:
- Big Data and Cloud Solutions on AWS & Azure, including data lakes, data warehouses, and scalable analytics platforms.
- Data modeling (relational, dimensional, NoSQL) and best practices for high-performance data systems.
- Enterprise data integration (ETL, messaging, streaming, APIs) with tools like Apache NiFi, AWS Glue, Azure Data Factory, Airflow, or Informatica.
- Data pipeline orchestration and event-driven architectures using Apache Spark, AWS Lambda, Azure Functions, Kinesis, or Kafka.
- Cloud security & governance frameworks (IAM, RBAC, encryption, GDPR, HIPAA, SOC2).
- BI & Analytics Tools: Power BI, AWS QuickSight, Tableau.
- Machine Learning & AI: Awareness of AI/ML principles and integration into data pipelines is a plus.
Preferred Certifications:
- Microsoft Certified: Azure Solutions Architect Expert
- AWS Certified Data Analytics – Specialty
Nice-to-Have Skills:
- Experience with Graph databases, Data Mesh architecture, and knowledge graphs.
- Hands-on experience with containerized & serverless architectures (Docker, Kubernetes, AWS Fargate).
- Experience working with real-time anomaly detection and predictive analytics.
Data Architect
Posted today
Job Description
Data Architect
About Keyrock
Since our beginnings in 2017, we've grown to be a leading change-maker in the digital asset space, renowned for our partnerships and innovation.
Today, we rock with over 180 team members around the world. Our diverse team hails from 42 nationalities, with backgrounds ranging from DeFi natives to PhDs. Predominantly remote, we have hubs in London, Brussels, Singapore and Paris, and host regular online and offline hangouts to keep the crew tight.
We are trading on more than 80 exchanges, and working with a wide array of asset issuers. As a well-established market maker, our distinctive expertise led us to expand rapidly. Today, our services span market making, options trading, high-frequency trading, OTC, and DeFi trading desks.
But we're more than a service provider. We're an initiator. We're pioneers in adopting the Rust Development language for our algorithmic trading, and champions of its use in the industry. We support the growth of Web3 startups through our Accelerator Program. We upgrade ecosystems by injecting liquidity into promising DeFi, RWA, and NFT protocols. And we push the industry's progress with our research and governance initiatives.
At Keyrock, we're not just envisioning the future of digital assets. We're actively building it.
Position Overview
The Data Architect is responsible for designing, implementing, and maintaining an organization's data architecture and strategy, ensuring that data is collected, stored, and processed efficiently and securely to support business intelligence, data analytics, and machine learning operations (MLOps) practices.
Key Responsibilities
- Designing Data Architecture: Plan and implement a robust, scalable data architecture that integrates data from various sources and supports diverse analytical needs, while optimizing costs and meeting business requirements.
- Implementing Data Engineering Pipelines: Design and develop data pipelines for data extraction, transformation, and loading (ETL) processes, ensuring data quality and consistency.
- Enabling Data Intelligence and Analytics: Build and maintain data warehouses, data marts, and data lakes to support business intelligence and data analytics initiatives.
- Supporting MLOps Practices: Collaborate with data scientists and machine learning engineers to design and implement data infrastructure and processes that support machine learning model development, deployment, and maintenance.
- Ensuring Data Security and Compliance: Implement security measures, policies, and procedures to safeguard data privacy and comply with relevant regulations.
- Data Governance and Management: Establish and enforce data governance policies and standards to ensure data quality, integrity, and accessibility.
- Collaborating with Cross-Functional Teams: Work closely with data engineers, data scientists, business analysts, and other stakeholders to understand data requirements and translate them into technical solutions.
- Staying Abreast of Technological Advancements: Keep up-to-date with emerging technologies and trends in data architecture, data engineering, and MLOps to identify opportunities for improvement and innovation.
- Optimizing Data Performance: Monitor and analyze data processing performance, identify bottlenecks, and implement optimizations to enhance efficiency and scalability.
- Documentation and Knowledge Sharing: Create and maintain comprehensive documentation of data architecture, models, and processing workflows.
Technical Requirements
- Extensive experience in data architecture design and implementation.
- Strong knowledge of data engineering principles and practices.
- Expertise in data warehousing, data modelling, and data integration.
- Experience in MLOps and machine learning pipelines.
- Proficiency in SQL and data manipulation languages.
- Experience with big data platforms (including Apache Arrow, Apache Spark, Apache Iceberg, and Clickhouse) and cloud-based infrastructure on AWS.
Education & Qualifications
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field, or equivalent experience.
Preferred certifications (optional):
- AWS Cloud Data Engineer
- AWS Machine Learning Ops Engineer
Leadership & Collaboration
- Passion for building scalable, reliable, and secure systems in a fast-paced environment.
- Ability to translate complex technical concepts into clear, actionable insights for technical teams.
- Strong interpersonal skills with the ability to work effectively across cross-functional teams.
- Excellent problem-solving and analytical skills.
Our recruitment philosophy
We value self-awareness and powerful communication skills in our recruitment process. We seek fiercely passionate people who understand themselves and their career goals. We're after those with the right skills and a conscious choice to join our field. The perfect fit? A crypto enthusiast who's driven, collaborative, acts with ownership and delivers solid, scalable outcomes.
Our offer
- Competitive salary package
- Autonomy in your time management thanks to flexible working hours and the opportunity to work remotely
- The freedom to create your own entrepreneurial experience by being part of a team of people in search of excellence
As an employer we are committed to building a positive and collaborative work environment. We welcome employees of all backgrounds, and hire, reward and promote entirely based on merit and performance.
Due to the nature of our business and external requirements, we perform background checks on all potential employees, passing which is a prerequisite to join Keyrock.
Data Architect
Posted today
Job Description
Databricks Data Architect, UAE
Working for a large regional Insurance company we are looking for a skilled Data Architect with experience to design, develop, and maintain scalable data pipelines and data solutions using Databricks.
You Will:
- Design and implement scalable data architectures using Databricks Unity Catalog, Delta Lake, and Apache Spark
- Develop and maintain complex ETL/ELT pipelines processing terabytes of data daily
- Design medallion-architecture (Bronze, Silver, Gold) data lakehouses with optimal performance patterns
- Implement real-time and batch data processing solutions using Structured Streaming and Delta Live Tables
- Design data mesh architectures with proper data governance and lineage tracking
- Optimize Spark jobs for cost efficiency and performance, including cluster auto-scaling and resource management
- Implement advanced Delta Lake features including time travel, vacuum operations, and Z-ordering
- Build robust data quality frameworks with automated testing and monitoring
- Design and implement CI/CD pipelines for data workflows using Databricks Asset Bundles or similar tools
- Develop custom solutions using Databricks APIs and integrate with external systems
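Among the Delta Lake features listed above, time travel works by keeping every committed snapshot of a table so readers can query any retained version. The concept can be sketched framework-free as a versioned append log; all class and method names here are hypothetical stand-ins, not Delta APIs:

```python
class VersionedTable:
    """Toy model of Delta-style time travel: each write commits a new
    immutable snapshot; readers can query any retained version."""

    def __init__(self):
        self._versions = []  # list of snapshots; index = version number

    def write(self, rows):
        self._versions.append(list(rows))
        return len(self._versions) - 1  # committed version number

    def read(self, version=None):
        # default is the latest snapshot, like a plain SELECT;
        # an explicit version mimics `SELECT ... VERSION AS OF n`
        v = len(self._versions) - 1 if version is None else version
        return self._versions[v]

t = VersionedTable()
v0 = t.write([{"id": 1, "qty": 5}])
t.write([{"id": 1, "qty": 5}, {"id": 2, "qty": 3}])
print(len(t.read()))             # latest snapshot has 2 rows
print(len(t.read(version=v0)))   # time travel back to 1 row
```

In real Delta tables, `VACUUM` deletes the data files behind old snapshots, which is exactly why it trades time-travel depth for storage cost; Z-ordering is an orthogonal optimization that co-locates related rows within each snapshot's files.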
Qualifications
You Have:
- A Databricks Certified Professional Data Engineer certification
- Proficiency in Python, PySpark, and SQL.
- 10+ years of experience in data engineering, including designing and deploying production-quality data pipelines and ETL/ELT solutions.
- Strong hands-on experience with cloud services such as AWS (Glue, Lambda, Redshift, S3).
- Experience with data governance and management platforms (e.g., AWS SageMaker Unified Studio and Unity Catalog).
- Experience with data migration projects and legacy system modernization
- Familiarity with containerization and orchestration technologies (Docker, Kubernetes) is a plus.
We can also take UK / EU Nationals and will support local visa, flights & relocation.
Contact me ASAP to discuss project details.
Thanks
Xavier
Data Architect
Posted today
Job Description
Leftfield is partnering with a confidential client to hire a Data Solutions Architect to join their client's AI and Big Data Center of Excellence in Abu Dhabi. The center is a hub for cutting-edge innovation across Strategy, Governance, Data Science, AI Engineering, Data Architecture, and Visual Insights. This role offers a unique opportunity to be part of a transformative journey at the intersection of data, AI, and fintech across markets.
As a Data Solutions Architect, you will lead enterprise-wide data strategy and architecture initiatives, playing a critical role in scaling data-intensive applications and ensuring best practices across data ecosystems. This is a high-impact role that requires deep technical expertise, business acumen, and strong stakeholder engagement.
Key Responsibilities
- Lead data architecture design, implementation, and optimization across the enterprise.
- Coordinate across teams to deliver consistent and scalable data products.
- Collaborate with stakeholders to translate business needs into technical data solutions.
- Promote and implement data-as-a-product mindset across the organization.
- Champion best practices in data privacy, governance, security, and compliance.
- Partner with global teams, cybersecurity, and business leaders to ensure secure and meaningful use of data.
- Enable data-driven decision-making through efficient API development, modeling, and harmonization.
- Mentor and support engineering teams in resolving architectural and technical challenges.
What You Bring
- Bachelor's, Master's, or PhD in Computer Science, Data Science, AI, or a related field.
- Extensive experience in data architecture and delivering scalable data-intensive applications.
- Strong understanding of cloud platforms (AWS, Azure, GCP) and cloud-native data technologies.
- Proficient in relational, NoSQL, graph databases, vector stores, and data warehousing (Kimball, Inmon, Data Vault).
- Hands-on experience with decentralized Data Mesh and MLOps is desirable.
- Proven track record in the fintech sector is a strong advantage.
Expected Milestones
Month 1:
- Assess current data ecosystem and identify short-term improvements.
- Build strong relationships with data governance, security, and engineering teams.
Month 3:
- Present a high-level 6–12 month roadmap.
- Propose architectural solutions to support agile development and scalability.
Month 6:
- Refine and scale enterprise architecture based on feedback.
- Drive strategic initiatives that align data architecture with business objectives.
What Sets You Apart
- Exceptional problem-solving and architectural solutioning skills.
- Ability to bridge technical and business stakeholders effectively.
- Proven impact in the fintech sector and innovation-driven teams.
- Passion for learning and growing in a fast-paced, collaborative environment.
Data Architect
Posted 1 day ago
Job Description
At Kyndryl, we design, build, manage and modernize the mission-critical technology systems that the world depends on every day. So why work at Kyndryl? We are always moving forward – always pushing ourselves to go further in our efforts to build a more equitable, inclusive world for our employees, our customers and our communities.
The Role
Kyndryl is seeking an accomplished Senior Data Architect, an individual contributor and team leader, to design, build, and operate robust data ecosystems for our customers. This pivotal role involves shaping and managing data architectures that enable scalable, secure, and innovative data solutions aligned with business strategies.
Responsibilities
- Design and implement modern data architectures encompassing data foundations, data fabric, and integration with AI and agentic AI systems.
- Develop and maintain comprehensive data models (conceptual, logical, physical) and schema designs that support large-scale enterprise environments.
- Lead the design of ETL processes and DataOps workflows ensuring efficient, high-quality data integration and transformation.
- Establish and enforce data governance policies, metadata management, and data quality standards.
- Collaborate with cross-functional teams including business stakeholders, data engineers, AI platform architects, and technical leadership to translate requirements into scalable data solutions.
- Oversee and guide team members and other data architects, providing mentorship and technical direction.
- Evaluate and adopt technologies for cloud data platforms, data lakes, warehouses, and streaming data ecosystems.
- Drive cost optimization and performance tuning of data platforms.
- Ensure compliance with industry standards and regulatory requirements.
- Promote best practices in data security, access control, and data lineage management.
- Lead innovation in data architecture strategies to support cutting-edge AI / ML initiatives and agentic AI system integration.
Requirements
- Bachelor's degree in Computer Science, Information Systems, Engineering, or related fields; Master's degree preferred.
- Minimum 10+ years' experience designing and implementing large-scale data architectures and data foundations.
- Deep expertise in data modeling, schema design, ETL framework development, and DataOps.
- Strong knowledge of modern data ecosystem technologies including but not limited to cloud data platforms (AWS, Azure, GCP), data lakes, data warehouses (Snowflake, Redshift, Synapse), streaming platforms (Kafka), and metadata / catalog tools.
- Knowledge of Palantir Foundry (Palantir AIP preferred).
- Proficiency with integration of data architectures with AI platforms and familiarity with agentic AI systems.
- Solid understanding of data governance, data security, and compliance frameworks.
- Proven team player with excellent collaboration skills across business and technical teams.
- Experience with data architecture supporting financial services or related regulated industries is a strong plus.
- Experience in requirements gathering and translating business needs into technical designs.
- Proficiency with DevOps and CI / CD practices for data workflows.
- Strong architecture skills with hands-on cloud platform experience.
- Excellent communication, presentation, and organizational abilities to lead effective stakeholder engagement.
- Certifications in data architecture or cloud platforms (e.g., CDMP, AWS Certified Solutions Architect) are advantageous.
- Arabic language skills
This role offers a unique opportunity to architect the future of data ecosystems at Kyndryl, driving innovation and enabling data-driven decision-making at enterprise scale. We seek a strategic thinker with deep technical acumen and leadership abilities who thrives designing foundational data capabilities that empower advanced AI and analytics initiatives.
#AgenticAI
Being You
Diversity is a whole lot more than what we look like or where we come from; it's how we think and who we are. We welcome people of all cultures, backgrounds, and experiences. But we're not doing it single-handedly: our Kyndryl Inclusion Networks are only one of many ways we create a workplace where all Kyndryls can find and provide support and advice. This dedication to welcoming everyone into our company means that Kyndryl gives you – and everyone next to you – the ability to bring your whole self to work, individually and collectively, and support the activation of our equitable culture. That's the Kyndryl Way.
What You Can Expect
With state-of-the-art resources and Fortune 100 clients, every day is an opportunity to innovate, build new capabilities, new relationships, new processes, and new value. Kyndryl cares about your well-being and prides itself on offering benefits that give you choice, reflect the diversity of our employees, and support you and your family through the moments that matter – wherever you are in your life journey. Our employee learning programs give you access to the best learning in the industry to receive certifications, including Microsoft, Google, Amazon, Skillsoft, and many more. Through our company-wide volunteering and giving platform, you can donate, start fundraisers, volunteer, and search over 2 million non-profit organizations. At Kyndryl, we invest heavily in you; we want you to succeed so that together, we will all succeed.
Get Referred
If you know someone that works at Kyndryl, when asked 'How Did You Hear About Us' during the application process, select 'Employee Referral' and enter your contact's Kyndryl email address.