62 Data Pipelines jobs in the United Arab Emirates

Senior Manager – Data & AI Architecture

Abu Dhabi, Abu Dhabi – Solutions+

Posted 1 day ago


Job Description

Role Description:

Data Architecture Strategy & Design:
  • Define and implement enterprise-wide data architecture strategies, ensuring scalability, performance, and security.
  • Develop data models, pipelines, and storage solutions that support structured, semi-structured, and unstructured data.
  • Ensure alignment between data architecture, analytics, AI/ML models, and business intelligence (BI) platforms.
  • Drive the adoption of data lakehouse, cloud data platforms (Azure Synapse, AWS Redshift, Google BigQuery), and distributed computing frameworks.
  • Work closely with data engineering and analytics teams to optimize data pipelines and transformations.

Cloud & Big Data Ecosystem Integration:
  • Lead the design and implementation of cloud-based data solutions, ensuring seamless integration with enterprise applications.
  • Define best practices for data ingestion, ETL/ELT, and real-time data streaming (Kafka, Spark, Databricks, etc.); a minimal streaming sketch follows this description.
  • Ensure interoperability between on-premise, hybrid, and multi-cloud data environments.
  • Optimize data storage, processing, and retrieval across cloud platforms, improving cost efficiency and performance.

Data Governance & Compliance:
  • Implement data governance frameworks, ensuring compliance with GDPR, HIPAA, ISO 27001, and other regulatory standards.
  • Define metadata management, data lineage, and data cataloging best practices.
  • Work with security teams to enforce data access controls, encryption, and role-based access control (RBAC).
  • Ensure data quality and integrity across the organization’s analytical and AI platforms.

Data Integration & Interoperability:
  • Develop and maintain enterprise data integration frameworks, enabling smooth data exchange between systems.
  • Lead API-driven data integrations, event-driven architectures, and message queues for real-time data movement.
  • Establish data fabric and data mesh approaches, enabling a scalable, decentralized data ecosystem.
  • Work closely with business units, AI teams, and application developers to provide accessible, high-quality data.

AI & Advanced Analytics Enablement:
  • Support AI and ML model development by ensuring high-quality data availability.
  • Define data architecture principles that optimize AI pipelines and feature engineering processes.
  • Work with AI engineers to deploy and operationalize AI models in cloud and edge environments.
  • Ensure efficient data versioning, feature stores, and reproducibility frameworks for AI workflows.

Leadership & Team Development:
  • Lead a team of data architects, data modelers, and data integration specialists.
  • Foster a culture of data-driven decision-making and continuous learning.
  • Develop training programs and best practices for data architecture governance, performance optimization, and security.
  • Collaborate with cross-functional teams, including Data Science, Engineering, and Business Intelligence teams.
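For illustration, a minimal PySpark sketch of the kind of real-time ingestion pipeline referenced above (Kafka into a lakehouse landing zone), assuming the Spark Kafka connector package is on the classpath; the broker address, topic, schema, and storage paths are hypothetical placeholders rather than anything specified by the posting.

    # Minimal sketch: stream JSON events from Kafka, parse them, and land them
    # with checkpointing. All names below are hypothetical.
    from pyspark.sql import SparkSession
    from pyspark.sql.functions import from_json, col
    from pyspark.sql.types import StructType, StructField, StringType, DoubleType, TimestampType

    spark = SparkSession.builder.appName("kafka_ingest_sketch").getOrCreate()

    # Expected shape of each Kafka message value (JSON).
    event_schema = StructType([
        StructField("event_id", StringType()),
        StructField("amount", DoubleType()),
        StructField("event_time", TimestampType()),
    ])

    raw = (
        spark.readStream.format("kafka")
        .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
        .option("subscribe", "payments")                    # hypothetical topic
        .load()
    )

    events = (
        raw.selectExpr("CAST(value AS STRING) AS json")
        .select(from_json(col("json"), event_schema).alias("e"))
        .select("e.*")
    )

    query = (
        events.writeStream.format("parquet")
        .option("path", "/data/bronze/payments")        # hypothetical lake path
        .option("checkpointLocation", "/chk/payments")  # enables restart/recovery
        .outputMode("append")
        .start()
    )
    query.awaitTermination()

In a Databricks environment the sink would more likely be a governed Delta table than raw Parquet, but the read, parse, and checkpointed write structure is the same.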

Requirements:
  • Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related field.
  • 10+ years of experience in data architecture, big data engineering, or cloud data solutions.
  • Proven expertise in Azure, AWS, and/or Google Cloud-based data ecosystems (with certifications; Azure required).
  • Experience with data modeling tools, metadata management, and data governance platforms.
  • Strong understanding of data mesh, data fabric, and decentralized data architectures.
  • Background in leading and managing data architecture teams in enterprise environments.
  • Familiarity with BI/analytics tools such as Power BI, Tableau, or Looker.
  • Experience in supporting AI/ML workflows with structured, high-quality data pipelines.


Sr. Manager – Data & AI Architecture

Abu Dhabi, Abu Dhabi – Solutions+ (A Mubadala company)

Posted 1 day ago


Job Description


The purpose of this role is to design and implement enterprise data architecture, governance, and integration frameworks that support both internal data operations and client-facing solutions. The role is integral to ensuring scalable, secure, and high-performing data architectures that align with business intelligence, analytics, and AI-driven initiatives, and to driving best practices in data modeling, cloud data platforms, data lakes, and real-time data processing while working closely with engineering, AI, and business teams to enable data-driven decision-making. Additionally, this role will serve as an AI Architect, guiding AI-driven solutions by aligning AI strategy with enterprise goals.

Role Description

  • Data Architecture Strategy & Design
  • Data Governance & Compliance
  • AI & Advanced Analytics Enablement
  • Leadership & Team Development

Job Specific Knowledge and Skills:

  • Expertise in enterprise data architecture, cloud data platforms, and big data technologies.
  • Strong knowledge of data modeling, data lakes, data warehouses, and real-time processing architectures.
  • Proven ability to design scalable, secure, and high-performance data solutions.
  • Experience in data governance, metadata management, and regulatory compliance.
  • Strong background in ETL/ELT processes, API-based data integrations, and event-driven architectures.
  • Ability to optimize AI/ML pipelines through structured data architecture frameworks.
  • Proficiency in SQL, Python, Spark, and cloud-based data engineering tools.
  • Strong leadership in team development, cross-functional collaboration, and stakeholder engagement.
  • Ability to drive innovation in data management, enabling AI and advanced analytics capabilities.
  • Experience in cost optimization and performance tuning for cloud-based data solutions.
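For illustration, a minimal Python sketch of the kind of automated data-quality check implied by the governance and AI-pipeline skills listed above, assuming pandas; the DataFrame, key column, and threshold are hypothetical.

    # Minimal sketch: simple data-quality gate for a pipeline output.
    import pandas as pd

    def run_quality_checks(df: pd.DataFrame, key_column: str, max_null_rate: float = 0.01) -> list[str]:
        """Return a list of human-readable data-quality failures (empty list = pass)."""
        failures = []
        # Key column must be unique.
        if df[key_column].duplicated().any():
            failures.append(f"duplicate values found in key column '{key_column}'")
        # No column may exceed the allowed null rate.
        null_rates = df.isna().mean()
        for column, rate in null_rates.items():
            if rate > max_null_rate:
                failures.append(f"column '{column}' null rate {rate:.1%} exceeds {max_null_rate:.1%}")
        return failures

    # Example usage with a tiny in-memory frame standing in for a pipeline output.
    sample = pd.DataFrame({"customer_id": [1, 2, 2], "email": ["a@x.com", None, "c@x.com"]})
    for failure in run_quality_checks(sample, key_column="customer_id"):
        print("DQ FAILURE:", failure)

A check like this would typically run as a pipeline step, failing the load or raising an alert whenever the returned list is non-empty.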

Preferred Qualifications & Experience:

  • Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related field.
  • 10+ years of experience in data architecture, big data engineering, or cloud data solutions.
  • Proven expertise in Azure, AWS, and/or Google Cloud-based data ecosystems (with certifications, Azure required).
  • Experience with data modeling tools, metadata management, and data governance platforms.
  • Strong understanding of data mesh, data fabric, and decentralized data architectures.
  • Background in leading and managing data architecture teams in enterprise environments.
  • Familiarity with BI/analytics tools such as Power BI, Tableau, or Looker.
  • Experience in supporting AI/ML workflows with structured, high-quality data pipelines.
  • Seniority level: Mid-Senior level
  • Employment type: Full-time
  • Job function: Engineering, Information Technology, and Other
  • Industries: IT Services and IT Consulting; Business Consulting and Services; Technology, Information and Media


Senior Manager – Data & AI Architecture

Abu Dhabi, Abu Dhabi – Solutions+

Posted 21 days ago


Job Description

The Role
Role description and requirements for this posting are identical to the Senior Manager – Data & AI Architecture posting above.

About the company
Solutions+ is a wholly owned subsidiary of Mubadala Investment Company. Established 10 years ago, we are the leading UAE shared services company, offering a range of solutions spanning finance, human resources, IT, procurement, facilities, and sustainability. The Solutions+ portfolio of brands covers various service sectors across the UAE, including sports and entertainment, in addition to business processes and ESG. Our vision is to drive value, for our clients and our nation, by providing world-class business performance solutions. Leveraging our deep knowledge and expertise, sustainable processes, and cutting-edge technologies, we offer direct management and counsel across vital infrastructure functions, from operations to digital services.

Data & Integration Specialist

Abu Dhabi, Abu Dhabi – Gulf Data International

Posted today


Job Description

How to apply: Data & Integration Specialist

GDI takes pride in creating opportunities for personal growth and achievement. To apply for a job opening, please email your latest CV in MS Word or PDF format to , with the job title applied for in the subject line.

Job Description

We are actively seeking a Data & Integration Specialist with expertise in Oracle Database to join our team. In this role, you will be responsible for the creation, implementation, and maintenance of data models essential for supporting the integration, reporting, and analytics functions of the organization. Your responsibilities will involve close collaboration with business stakeholders, data analysts, and IT teams to understand data requirements and transform them into efficient and robust data models.

Responsibilities
  1. Collaborate with business stakeholders to understand integration, reporting, and analytics data requirements and objectives.
  2. Design and develop conceptual, logical, and physical data models in alignment with Oracle Database standards.
  3. Translate business requirements into precise data models.
  4. Work closely with Oracle Database administrators and SAP specialists to ensure seamless alignment of data models with integration needs.
  5. Collaborate with ETL developers and data engineers to ensure smooth data integration across various databases, SAP, and other relevant systems (an illustrative integration sketch appears after the qualifications list below).

Qualifications
  1. Bachelor’s degree in Computer Science, Information Technology, or a related field.
  2. 3-5 years of experience as a data modeler and integration expert, including expertise with REST/SOAP-based services.
  3. Experience with programming languages such as .NET, Node.js, or Python for REST-based service implementation is desirable.
  4. Experience with API and middleware testing tools such as SoapUI and Postman is preferred.
  5. Proficient knowledge of Oracle Database architecture, data structures, and SQL; hands-on experience with SAP data environments is preferred.
  6. Exceptional analytical and problem-solving skills, with a proven ability to translate business requirements into effective data models within Oracle Database and SAP landscapes.
  7. Strong communication and interpersonal abilities, enabling effective collaboration with stakeholders at all organizational levels, including Oracle Database administrators, SAP specialists, and business users.
  8. Capacity to thrive both independently and as part of a team in a fast-paced, dynamic work environment.
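For illustration, a minimal Python sketch of a REST-to-Oracle integration of the kind described in the responsibilities above, assuming the requests and python-oracledb packages and an endpoint that returns a JSON array of objects; the endpoint, staging table, and column names are hypothetical.

    # Minimal sketch: pull records from a REST service and upsert them into an
    # Oracle staging table. Endpoint, table, and columns are hypothetical.
    import requests
    import oracledb  # python-oracledb thin driver

    def sync_products(api_base_url: str, dsn: str, user: str, password: str) -> int:
        """Pull products from a REST endpoint and upsert them into an Oracle staging table."""
        response = requests.get(f"{api_base_url}/products", timeout=30)
        response.raise_for_status()
        # Assumes each record has "id" and "name" fields.
        rows = [(p["id"], p["name"]) for p in response.json()]

        merge_sql = """
            MERGE INTO stg_products t
            USING (SELECT :1 AS product_id, :2 AS product_name FROM dual) src
            ON (t.product_id = src.product_id)
            WHEN MATCHED THEN UPDATE SET t.product_name = src.product_name
            WHEN NOT MATCHED THEN INSERT (product_id, product_name)
                                  VALUES (src.product_id, src.product_name)
        """
        with oracledb.connect(user=user, password=password, dsn=dsn) as connection:
            with connection.cursor() as cursor:
                cursor.executemany(merge_sql, rows)
            connection.commit()
        return len(rows)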


Expertise in Data Integration

Dubai, Dubai – beBeeDataServices

Posted today


Job Description

Job Title: SAP BODS Professional

The role of a SAP BODS Professional involves working with our team to design and implement data services solutions using Business Objects Data Services (BODS). The ideal candidate will have a strong background in BODS, experience in preparing technical specification documents, and a solid understanding of developing real-time interfaces such as IDoc and BAPI with SAP applications.

Key Responsibilities:
  • Design and implement data services solutions using BODS.
  • Prepare technical specification documents for data services projects.
  • Develop real-time interfaces such as IDoc and BAPI with SAP applications using BODS.
  • Analyze and fix ETL issues independently.
  • Install and configure BODS.
  • Understand and manage BODS landscape and architecture.
  • Estimate BODS work effort.
Requirements:
  • More than 5 years of experience in BODS design and architecture.
  • Experience in preparing Technical Specification documents.
  • Strong hands-on experience in Business Objects Data Services (BODS) as a technical developer.
  • Thorough knowledge of developing real-time interfaces such as IDoc and BAPI with SAP applications using BODS.
  • Solid knowledge of SQL/PLSQL.
  • Experience in Data Migration Projects with end-to-end implementation.
  • Good analytical skills to independently analyze and fix ETL issues.
  • Experience in installation and configuration of BODS.
  • Knowledge of various transformations in BODS.
  • Understanding of BODS landscape and architecture.
  • Ability to estimate BODS work effort.
  • Knowledge of AWS Glue, Amazon Athena, DBT, Dagster is a plus.
Job Details:
  • Seniority level: Not Applicable
  • Employment type: Contract
  • Job function: Information Technology
  • Industries: IT Services and IT Consulting

Senior Data Engineer - Systems Architecture, Design & Development

Ras Al Khaimah, Ra's al Khaymah – Wynn Al Marjan Island

Posted 1 day ago


Job Description

General Purpose
We are seeking an experienced Senior Data Engineer to join our dynamic team. The ideal candidate will have a strong background in data engineering, a passion for innovation, and the technical expertise to design and implement robust, scalable, and efficient data systems. As a Grade 2 Data Engineer, you will focus on designing, optimizing, and maintaining complex data pipelines and workflows while collaborating with cross-functional teams to support data-driven decision-making across the organization. This role requires advanced technical skills and a commitment to delivering high-quality solutions.
Nature & Scope
Essential Duties & Tasks

  • Lead architecture development and solution design sessions, contributing strategic insights to ensure robust, scalable, and efficient data systems
  • Collaborate with architects and cross-functional teams to design and implement end-to-end data solutions that address complex business needs
  • Develop advanced data assets, including highly scalable ETL/ELT pipelines, optimized data structures, and dynamic data workflows on Wynn’s cloud infrastructure
  • Drive data quality initiatives, including sophisticated data profiling, cleansing, and anomaly detection, leveraging advanced tools and methodologies
  • Establish, enforce, and continuously refine data quality standards while designing automated processes for issue identification and resolution
  • Act as a technical leader within the team, mentoring junior engineers, conducting detailed code reviews, and setting standards for best practices
  • Analyze and translate complex business requirements into detailed technical specifications, ensuring solutions align with broader organizational goals
  • Define, design, and implement advanced data ingestion patterns (batch, real-time streaming, and hybrid) tailored to diverse use cases and business requirements
  • Lead data modeling initiatives, developing and optimizing both logical and physical data models to support high-performance analytics and operations
  • Collaborate closely with stakeholders to design and deliver data models that meet advanced operational, analytical, and reporting needs
  • Drive data governance initiatives by contributing to the design and implementation of robust security, compliance, and data privacy frameworks
  • Ensure data integrity and quality by embedding validation, anomaly detection, and self-healing mechanisms throughout data workflows
  • Create comprehensive technical documentation, including data flows, pipeline designs, and operational runbooks to support system maintainability
  • Evaluate, recommend, and implement cutting-edge data ingestion, integration, and replication tools to enhance the efficiency of cloud-based analytics systems
  • Lead proof-of-concept projects to assess and introduce innovative data engineering technologies, tools, and frameworks
  • Design and implement highly reliable real-time data streaming solutions using platforms such as Azure Event Hubs, Kafka, or AWS Kinesis to meet time-sensitive business needs
  • Develop and maintain complex, reusable ETL/ELT processes that can handle large-scale, dynamic datasets efficiently and securely.
  • Own the maintenance and performance optimization of data ingestion tools, ensuring they meet high availability and reliability standards
  • Continuously optimize existing data pipelines, identifying and resolving performance bottlenecks to improve scalability and cost-efficiency
  • Provide advanced support to analytics and data science teams by curating high-quality, well-documented, and accessible datasets tailored to their needs
  • Lead efforts to monitor and proactively troubleshoot data pipelines, implementing automated alerts and self-healing mechanisms to minimize downtime
  • Research and propose strategies for adopting emerging trends in data engineering, driving innovation and process improvements.
  • Implement and manage robust CI/CD workflows to automate pipeline deployment, testing, and version control for seamless operations
  • Regularly upgrade and enhance data ingestion tools, ensuring system resilience and alignment with the latest industry best practices
  • Contribute to cross-functional projects, demonstrating expertise in serverless architectures, API integration (e.g., FastAPI), and scalable cloud solutions
  • Drive knowledge-sharing initiatives, such as training sessions and technical presentations, to elevate the team’s overall capabilities and expertise
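For illustration, a minimal FastAPI sketch of the API-integration duty listed above (serving a curated dataset to downstream consumers), assuming the fastapi and pydantic packages; the endpoint and dataset are hypothetical and not Wynn's implementation.

    # Minimal sketch: expose a curated metric over a small, typed REST API.
    # Names and data are hypothetical stand-ins.
    from fastapi import FastAPI, HTTPException
    from pydantic import BaseModel

    app = FastAPI(title="curated-data-api-sketch")

    class Metric(BaseModel):
        name: str
        value: float

    # In-memory stand-in for a curated, well-documented dataset.
    _METRICS = {"daily_occupancy": Metric(name="daily_occupancy", value=0.87)}

    @app.get("/metrics/{name}", response_model=Metric)
    def read_metric(name: str) -> Metric:
        """Return a curated metric by name, or 404 if it is not published."""
        if name not in _METRICS:
            raise HTTPException(status_code=404, detail="metric not found")
        return _METRICS[name]

It can be served locally with uvicorn pointed at the module that defines app.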
Education
  • A Bachelor’s degree in computer science, information technology, or a related field is required; a Master’s degree is preferred but not mandatory
  • Minimum age 21
Experience
  • 5 to 7 years of hands-on experience in data engineering, demonstrating consistent career progression and technical growth
  • Proven ability to design, develop, and deploy highly scalable and efficient data solutions for complex business needs
  • Extensive experience managing and optimizing complex data integrations across diverse systems, platforms, and cloud environments
Skills / Knowledge
  • Advanced proficiency in programming languages such as Python, SQL, and Shell Scripting, with the ability to implement optimized and scalable code solutions
  • Deep expertise in data platforms like Snowflake and Databricks, including extensive experience working with PySpark and distributed dataframes to process large-scale datasets
  • Advanced knowledge of orchestration tools such as Azure Data Factory, Apache Airflow, and Databricks workflows, including the ability to design and manage complex, multi-step workflows
  • Significant hands-on experience with tools like DBT for data transformation and replication solutions such as Qlik for efficient data migration and synchronization
  • Strong understanding of big data systems and frameworks, with practical experience in building and optimizing solutions for high-volume and high-velocity data
  • Extensive experience with version control tools such as GitHub or Azure DevOps, including implementing CI/CD pipelines for data engineering workflows
  • Advanced knowledge of serverless computing, including designing and deploying scalable solutions using Azure Functions with Python
  • Proficiency in API development frameworks such as FastAPI, with the ability to create robust, efficient, and secure data-driven APIs
  • Comprehensive expertise in designing and implementing ETL/ELT processes with a focus on performance, scalability, and maintainability
  • Proven experience in data warehouse development, including hands-on expertise with dimensional modeling and schema optimization for analytics
  • Solid English language communication skills, both written and verbal, for effective collaboration across teams and stakeholders
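For illustration, a minimal Apache Airflow sketch of the orchestration skills listed above, assuming Airflow 2.x; the DAG name, schedule, and task bodies are hypothetical stand-ins for real extract/transform/load steps.

    # Minimal sketch: a three-step daily pipeline DAG. Task bodies are placeholders.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract() -> None:
        print("pulling source data")        # stand-in for a real extract step

    def transform() -> None:
        print("applying business rules")    # stand-in for a real transform step

    def load() -> None:
        print("writing to the warehouse")   # stand-in for a real load step

    with DAG(
        dag_id="daily_sales_pipeline_sketch",  # hypothetical pipeline name
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        t_extract = PythonOperator(task_id="extract", python_callable=extract)
        t_transform = PythonOperator(task_id="transform", python_callable=transform)
        t_load = PythonOperator(task_id="load", python_callable=load)

        t_extract >> t_transform >> t_load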
Certifications Required/Preferred
  • Snowflake SnowPro Core Certification
  • Databricks Certified Data Engineer Professional
  • Microsoft Azure Data Engineer Associate
Work Conditions
This is an office-based position with regular working hours.

  • Seniority level: Mid-Senior level
  • Employment type: Full-time
  • Job function: Information Technology
  • Industries: Hospitality


Senior ETL Developer

Abu Dhabi, Abu Dhabi – Dicetek LLC

Posted 1 day ago


Job Description

We are seeking a highly skilled and experienced Senior ETL Developer to join our dynamic team. The ideal candidate will have extensive experience with Informatica BDM and Databricks pipeline, along with strong knowledge of SQL and PowerShell. The candidate should be proficient in designing ETL workflows and possess excellent communication skills. An understanding of data modeling and DAX queries is a plus.

Key Responsibilities

  • Design, develop, and maintain ETL processes using Informatica BDM and Databricks pipeline.
  • Collaborate with data architects and business analysts to understand data requirements and translate them into ETL solutions.
  • Optimize and troubleshoot ETL workflows to ensure high performance and reliability.
  • Develop and maintain scripts using Oracle SQL and PowerShell for data extraction, transformation, and loading.
  • Ensure data quality and integrity throughout the ETL process.
  • Document ETL processes, workflows, and data mappings.
  • Communicate effectively with team members, stakeholders, and management to provide updates and gather requirements.
  • Utilize data modeling techniques and DAX queries to enhance data analysis and reporting capabilities.
  • Leverage Azure services and tools to support ETL processes and data integration.
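For illustration, a minimal PySpark sketch of the Databricks/Oracle SQL side of the responsibilities above (extract over JDBC, transform, load to a lake path); Informatica BDM mappings are built in Informatica's own tooling, so only the Spark portion is sketched here, and the connection details, table, and column names are hypothetical.

    # Minimal sketch: batch extract from Oracle over JDBC, basic cleansing, lake load.
    # The Oracle JDBC driver jar must be available; all names are hypothetical.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("oracle_to_lake_sketch").getOrCreate()

    # Extract: read a source table from Oracle.
    orders = (
        spark.read.format("jdbc")
        .option("url", "jdbc:oracle:thin:@//dbhost:1521/ORCLPDB1")  # hypothetical connection
        .option("dbtable", "SALES.ORDERS")                          # hypothetical table
        .option("user", "etl_user")
        .option("password", "change_me")
        .option("driver", "oracle.jdbc.OracleDriver")
        .load()
    )

    # Transform: deduplicate, derive a total, and drop invalid rows.
    cleaned = (
        orders.dropDuplicates(["ORDER_ID"])
        .withColumn("ORDER_TOTAL", F.col("QUANTITY") * F.col("UNIT_PRICE"))
        .filter(F.col("ORDER_TOTAL") > 0)
    )

    # Load: write to a curated lake path partitioned by order date.
    cleaned.write.mode("overwrite").partitionBy("ORDER_DATE").parquet("/data/curated/orders")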
Qualifications
  • Bachelor's degree in Computer Science, Information Technology, or a related field.
  • Minimum of 7 years of experience in ETL development.
  • Strong experience with Informatica BDM and Databricks pipeline.
  • Proficient in Oracle SQL and PowerShell.
  • Experience in designing and optimizing ETL workflows.
  • Excellent problem-solving skills and attention to detail.
  • Strong communication and interpersonal skills.
  • Ability to work independently and as part of a team.
  • Understanding of data modeling and DAX queries is an added advantage.
  • Experience with Azure services and tools.
Preferred Qualifications
  • Knowledge of data warehousing concepts and best practices.
  • Certification in Informatica, Databricks, or Azure is a plus.
  • Seniority level: Not Applicable
  • Employment type: Contract
  • Job function: Business Development and Sales
  • Industries: IT Services and IT Consulting


ETL Developer - SAP BODS

Dubai, Dubai – Dautom

Posted 1 day ago


Job Description

In this role, you will have the opportunity to work closely with one of our esteemed clients. This client is a global leader in the Logistics Industry known for its commitment to quality and innovation. They have chosen Dautom as their trusted partner for their upcoming projects.

Job Title: ETL Developer - SAP BODS

Responsibilities:
  • Extract data from external sources and web services and store it in the data lake (a minimal landing sketch follows this list).
  • Apply transformations to prepare data extracted from source system databases such as Oracle and SQL Server.
  • Perform data quality checks, integration, and cleansing of data before loading it into data warehouses.
  • Proficient knowledge of Oracle SQL is mandatory.
  • Proficiency in Oracle/SQL to write dynamic SQL queries to extract data.
  • Good understanding of structured/unstructured data architectures and data modelling concepts.
  • Skilled in the complete lifecycle of SAP BODS/BODI development, including analysis, design, testing, implementation, and support, as well as providing regular updates.
  • Responsible for delivering analyses for functional and business requirements.
  • Strong written and verbal communication skills.
  • Ability to recognize and clearly report relevant information.
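For illustration, a minimal Python sketch of the first responsibility above (landing web-service extracts into a data lake as date-partitioned files), assuming the requests package; the endpoint and lake path are hypothetical, and a local folder stands in for the lake before downstream BODS/SQL transformations.

    # Minimal sketch: pull records from a web service and land them as
    # newline-delimited JSON, partitioned by load date. Names are hypothetical.
    import json
    import pathlib
    from datetime import date

    import requests

    def land_to_lake(endpoint: str, lake_root: str) -> pathlib.Path:
        """Extract records from a web service and write them to a date-partitioned landing path."""
        response = requests.get(endpoint, timeout=60)
        response.raise_for_status()
        records = response.json()  # assumes a JSON array of objects

        target_dir = pathlib.Path(lake_root) / f"load_date={date.today().isoformat()}"
        target_dir.mkdir(parents=True, exist_ok=True)
        target_file = target_dir / "extract.jsonl"
        with target_file.open("w", encoding="utf-8") as handle:
            for record in records:
                handle.write(json.dumps(record) + "\n")
        return target_file

    # Example usage with a hypothetical endpoint and a local folder standing in for the lake.
    # land_to_lake("https://api.example.com/v1/shipments", "/data/lake/raw/shipments")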
Minimum Qualifications:
  • Bachelor’s degree in Computer Science, Computer Engineering, or any other relevant technical field.
  • 5+ years of experience in designing data marts/models using SAP BODS/BODI ETL or Informatica tools.
  • Knowledge of writing Python or R scripts is an added advantage.
Benefits and Perks:
  • Competitive salary and bonus structure.
  • Comprehensive health and wellness benefits.
  • Opportunities for professional development and growth.
  • Flexible work arrangements, including remote work options.
  • Employee recognition programs and a collaborative team environment.


Senior Oracle Data Integrator (ODI) ETL Developer

Ajman, Ajman – Dicetek LLC

Posted 1 day ago


Job Description

Technical Expertise
Core Expertise In Oracle Data Integrator (ODI)
  • Proficiency in designing and implementing data integration solutions using ODI 12c or later versions.
  • Strong knowledge of:
  • Knowledge Modules (KM): Creating and customizing reusable components for data integration processes.
  • Mappings, Packages, and Scenarios: Designing complex data flows, workflows, and automating ETL processes.
  • Topology Configuration: Configuring physical and logical schemas, data servers, and agents.
  • Expertise in ODI's ELT framework and knowledge of performance optimization techniques.
  • Experience in error handling, reconciliation frameworks, and debugging ODI processes.
Database Skills
  • Strong SQL and PL/SQL skills for querying, transforming, and managing data.
  • Deep knowledge of Oracle Database concepts, including:
  • Indexing, partitioning, and materialized views.
  • Query optimization techniques.
  • Familiarity with other databases like SQL Server, MySQL, or PostgreSQL is advantageous.
Data Warehousing And ETL Concepts
  • In-depth understanding of:
  • Dimensional modeling (Star and Snowflake schemas).
  • Fact and dimension tables, surrogate keys, slowly changing dimensions (SCDs), and hierarchies.
  • Data profiling, data cleansing, and data validation practices.
  • Hands-on experience in building and maintaining data warehouses and data lakes.
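For illustration, a minimal Python sketch of a slowly changing dimension (Type 2) load of the kind referenced above; in ODI this logic would normally live in a mapping and Knowledge Module rather than hand-written SQL, so this is only an illustrative plain-SQL version using python-oracledb, with hypothetical table and column names.

    # Minimal sketch: two-step SCD Type 2 load from a staging table into a
    # customer dimension. Table and column names are hypothetical placeholders.
    import oracledb

    EXPIRE_CHANGED_ROWS = """
        UPDATE dim_customer d
           SET d.current_flag = 'N',
               d.effective_to = SYSDATE
         WHERE d.current_flag = 'Y'
           AND EXISTS (
                 SELECT 1
                   FROM stg_customer s
                  WHERE s.customer_id = d.customer_id
                    AND (s.customer_name <> d.customer_name OR s.segment <> d.segment)
               )
    """

    INSERT_NEW_VERSIONS = """
        INSERT INTO dim_customer
            (customer_id, customer_name, segment, effective_from, effective_to, current_flag)
        SELECT s.customer_id, s.customer_name, s.segment, SYSDATE, NULL, 'Y'
          FROM stg_customer s
         WHERE NOT EXISTS (
                 SELECT 1
                   FROM dim_customer d
                  WHERE d.customer_id = s.customer_id
                    AND d.current_flag = 'Y'
               )
    """

    def load_scd2(user: str, password: str, dsn: str) -> None:
        with oracledb.connect(user=user, password=password, dsn=dsn) as connection:
            with connection.cursor() as cursor:
                cursor.execute(EXPIRE_CHANGED_ROWS)   # close out versions whose attributes changed
                cursor.execute(INSERT_NEW_VERSIONS)   # insert fresh 'current' versions for new/changed keys
            connection.commit()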
Experience And Professional Expertise
  • 5+ years of hands-on experience in developing and managing ETL/ELT processes using ODI.
  • Proven experience in implementing large-scale, end-to-end data integration projects, including:
  • Requirements gathering
  • Data modeling
  • Designing scalable ETL/ELT solutions.
  • Experience in upgrading or migrating ODI environments (e.g., from ODI 11g to 12c).

 
