62 Data Pipelines jobs in the United Arab Emirates
Senior Manager – Data & AI Architecture
Posted 1 day ago
Job Description
Role Description
Data Architecture Strategy & Design:
- Define and implement enterprise-wide data architecture strategies, ensuring scalability, performance, and security.
- Develop data models, pipelines, and storage solutions that support structured, semi-structured, and unstructured data.
- Ensure alignment between data architecture, analytics, AI/ML models, and business intelligence (BI) platforms.
- Drive the adoption of data lakehouses, cloud data platforms (Azure Synapse, AWS Redshift, Google BigQuery), and distributed computing frameworks.
- Work closely with data engineering and analytics teams to optimize data pipelines and transformations.
Cloud & Big Data Ecosystem Integration:
- Lead the design and implementation of cloud-based data solutions, ensuring seamless integration with enterprise applications.
- Define best practices for data ingestion, ETL/ELT, and real-time data streaming (Kafka, Spark, Databricks, etc.).
- Ensure interoperability between on-premises, hybrid, and multi-cloud data environments.
- Optimize data storage, processing, and retrieval across cloud platforms, improving cost efficiency and performance.
Data Governance & Compliance:
- Implement data governance frameworks, ensuring compliance with GDPR, HIPAA, ISO 27001, and other regulatory standards.
- Define metadata management, data lineage, and data cataloging best practices.
- Work with security teams to enforce data access controls, encryption, and role-based access control (RBAC).
- Ensure data quality and integrity across the organization’s analytical and AI platforms.
Data Integration & Interoperability:
- Develop and maintain enterprise data integration frameworks, enabling smooth data exchange between systems.
- Lead API-driven data integrations, event-driven architectures, and message queues for real-time data movement.
- Establish data fabric and data mesh approaches, enabling a scalable, decentralized data ecosystem.
- Work closely with business units, AI teams, and application developers to provide accessible, high-quality data.
AI & Advanced Analytics Enablement:
- Support AI and ML model development by ensuring high-quality data availability.
- Define data architecture principles that optimize AI pipelines and feature engineering processes.
- Work with AI engineers to deploy and operationalize AI models in cloud and edge environments.
- Ensure efficient data versioning, feature stores, and reproducibility frameworks for AI workflows.
Leadership & Team Development:
- Lead a team of data architects, data modelers, and data integration specialists.
- Foster a culture of data-driven decision-making and continuous learning.
- Develop training programs and best practices for data architecture governance, performance optimization, and security.
- Collaborate with cross-functional teams, including Data Science, Engineering, and Business Intelligence teams.
Requirements
- Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related field.
- 10+ years of experience in data architecture, big data engineering, or cloud data solutions.
- Proven expertise in Azure, AWS, and/or Google Cloud-based data ecosystems (with certifications; Azure required).
- Experience with data modeling tools, metadata management, and data governance platforms.
- Strong understanding of data mesh, data fabric, and decentralized data architectures.
- Background in leading and managing data architecture teams in enterprise environments.
- Familiarity with BI/analytics tools such as Power BI, Tableau, or Looker.
- Experience in supporting AI/ML workflows with structured, high-quality data pipelines.
Sr. Manager – Data & AI Architecture
Posted 1 day ago
Job Description
Posted by Solutions+ (A Mubadala company)
To design and implement enterprise data architecture, governance, and integration frameworks that support both internal data operations and client-facing solutions. This role is integral in ensuring scalable, secure, and high-performing data architectures that align with business intelligence, analytics, and AI-driven initiatives. To drive best practices in data modeling, cloud data platforms, data lakes, and real-time data processing while working closely with engineering, AI, and business teams to enable data-driven decision-making. Additionally, this role will serve as an AI Architect, guiding AI-driven solutions by aligning AI strategy with enterprise goals.
Role Description
- Data Architecture Strategy & Design
- Data Governance & Compliance
- AI & Advanced Analytics Enablement
- Leadership & Team Development
Job Specific Knowledge and Skills:
- Expertise in enterprise data architecture, cloud data platforms, and big data technologies.
- Strong knowledge of data modeling, data lakes, data warehouses, and real-time processing architectures.
- Proven ability to design scalable, secure, and high-performance data solutions.
- Experience in data governance, metadata management, and regulatory compliance.
- Strong background in ETL/ELT processes, API-based data integrations, and event-driven architectures.
- Ability to optimize AI/ML pipelines through structured data architecture frameworks.
- Proficiency in SQL, Python, Spark, and cloud-based data engineering tools.
- Strong leadership in team development, cross-functional collaboration, and stakeholder engagement.
- Ability to drive innovation in data management, enabling AI and advanced analytics capabilities.
- Experience in cost optimization and performance tuning for cloud-based data solutions.
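As a toy illustration of the event-driven integration pattern listed above, a minimal in-memory publish/subscribe bus might look like the sketch below. This is not any particular product's API; the class, topic name, and payload are invented, and a production system would use a broker such as Kafka rather than in-process dispatch.

```python
from collections import defaultdict
from typing import Any, Callable

class EventBus:
    """Minimal in-memory publish/subscribe bus (toy stand-in for broker-based messaging)."""

    def __init__(self) -> None:
        self._subscribers: dict[str, list[Callable[[Any], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[Any], None]) -> None:
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, event: Any) -> None:
        # Deliver the event to every handler registered for this topic.
        for handler in self._subscribers[topic]:
            handler(event)

# Example: a downstream consumer reacting to new customer records.
bus = EventBus()
received = []
bus.subscribe("customer.created", received.append)
bus.publish("customer.created", {"id": 1, "name": "Alice"})
```

The same subscribe/publish shape carries over to real brokers; only the transport and delivery guarantees change.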
Preferred Qualifications & Experience:
- Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related field.
- 10+ years of experience in data architecture, big data engineering, or cloud data solutions.
- Proven expertise in Azure, AWS, and/or Google Cloud-based data ecosystems (with certifications; Azure required).
- Experience with data modeling tools, metadata management, and data governance platforms.
- Strong understanding of data mesh, data fabric, and decentralized data architectures.
- Background in leading and managing data architecture teams in enterprise environments.
- Familiarity with BI/analytics tools such as Power BI, Tableau, or Looker.
- Experience in supporting AI/ML workflows with structured, high-quality data pipelines.
- Seniority level: Mid-Senior level
- Employment type: Full-time
- Job function: Engineering, Information Technology, and Other
- Industries: IT Services and IT Consulting, Business Consulting and Services, and Technology, Information and Media
About the company
Solutions+ is a wholly owned subsidiary of Mubadala Investment Company. Established 10 years ago, we are the leading UAE shared services company, offering a range of solutions across finance, human resources, IT, procurement, facilities, and sustainability. The Solutions+ portfolio of brands covers various service sectors across the UAE, including sports and entertainment, in addition to business processes and ESG. Our vision is to drive value, for our clients and our nation, by providing world-class business performance solutions. Leveraging our deep knowledge and expertise, sustainable processes, and cutting-edge technologies, we offer direct management and counsel across vital infrastructure functions, from operations to digital services.
Data & Integration Specialist
Posted today
Job Description
GDI takes pride in creating opportunities for personal growth and achievement. To apply for a job opening, please email your latest CV in MS Word or PDF format to , with the job title applied for in the subject line.
We are actively seeking a Data & Integration Specialist with expertise in Oracle Database to join our team. In this role, you will be responsible for the creation, implementation, and maintenance of data models essential for supporting the integration, reporting, and analytics functions of the organization. Your responsibilities will involve close collaboration with business stakeholders, data analysts, and IT teams to understand data requirements and transform them into efficient and robust data models.
Responsibilities:
- Collaborate with business stakeholders to comprehensively understand integration, reporting, and analytics data requirements and objectives.
- Design and develop conceptual, logical, and physical data models in alignment with Oracle Database standards.
- Exhibit proficiency in translating business requirements into precise data models.
- Work closely with Oracle Database administrators and SAP specialists to ensure seamless alignment of data models with integration needs.
- Collaborate with ETL developers and data engineers to ensure smooth data integration across various databases, SAP, and other relevant systems.
- Utilize programming languages such as .NET, Node.js, Python, etc., for REST-based service implementation (desirable).
- Preference for experience with middleware technologies such as SoapUI, Postman, and other middleware.
Qualifications:
- Bachelor’s degree in Computer Science, Information Technology, or a related field.
- 3-5 years of experience as a data modeler and integration expert, including expertise with REST/SOAP-based services.
- Proficient knowledge of Oracle Database architecture, data structures, SQL, and hands-on experience with SAP data environments is preferred.
- Exceptional analytical and problem-solving skills, with a proven ability to translate business requirements into effective data models within Oracle Database and SAP landscapes.
- Strong communication and interpersonal abilities, enabling effective collaboration with stakeholders at all organizational levels, including Oracle Database administrators, SAP specialists, and business users.
- Capacity to thrive both independently and as part of a team in a fast-paced, dynamic work environment.
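The REST-based service work this posting mentions can be sketched, purely illustratively, with Python's standard WSGI interface. The endpoint, payload shape, and field names below are invented for the example; a real integration would query Oracle or SAP rather than return a literal.

```python
import json

def customer_endpoint(environ, start_response):
    """Tiny WSGI app returning a JSON customer record (hypothetical endpoint)."""
    body = json.dumps({"id": 42, "source": "oracle"}).encode("utf-8")
    start_response("200 OK", [("Content-Type", "application/json"),
                              ("Content-Length", str(len(body)))])
    return [body]

# Invoke the app directly, the way a WSGI server would.
captured = {}
def start_response(status, headers):
    captured["status"] = status

result = customer_endpoint({"REQUEST_METHOD": "GET"}, start_response)
payload = json.loads(result[0])
```

In practice this would sit behind a framework or middleware layer, but the request-in, JSON-out contract is the same idea the posting describes.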
Expertise in Data Integration
Posted today
Job Description
The role of a SAP BODS Professional involves working with our team to design and implement data services solutions using Business Objects Data Services (BODS). The ideal candidate will have a strong background in BODS, experience in preparing technical specification documents, and a solid understanding of developing real-time interfaces such as IDoc and BAPI with SAP applications.
Key Responsibilities:
- Design and implement data services solutions using BODS.
- Prepare technical specification documents for data services projects.
- Develop real-time interfaces such as IDoc and BAPI with SAP applications using BODS.
- Analyze and fix ETL issues independently.
- Install and configure BODS.
- Understand and manage BODS landscape and architecture.
- Estimate BODS work effort.
Requirements:
- More than 5 years of experience in BODS design and architecture.
- Experience in preparing Technical Specification documents.
- Strong hands-on experience in Business Objects Data Services (BODS) as a technical developer.
- Thorough knowledge of developing real-time interfaces such as IDoc and BAPI with SAP applications using BODS.
- Solid knowledge of SQL/PLSQL.
- Experience in Data Migration Projects with end-to-end implementation.
- Good analytical skills to independently analyze and fix ETL issues.
- Experience in installation and configuration of BODS.
- Knowledge of various transformations in BODS.
- Understanding of BODS landscape and architecture.
- Ability to estimate BODS work effort.
- Knowledge of AWS Glue, Amazon Athena, DBT, Dagster is a plus.
- Seniority level: Not Applicable
- Employment type: Contract
- Job function: Information Technology
- Industries: IT Services and IT Consulting
Senior Data Engineer - Systems Architecture, Design & Development
Posted 1 day ago
Job Description
General Purpose
We are seeking an experienced Senior Data Engineer to join our dynamic team. The ideal candidate will have a strong background in data engineering, a passion for innovation, and the technical expertise to design and implement robust, scalable, and efficient data systems. As a Grade 2 Data Engineer, you will focus on designing, optimizing, and maintaining complex data pipelines and workflows while collaborating with cross-functional teams to support data-driven decision-making across the organization. This role requires advanced technical skills and a commitment to delivering high-quality solutions.
Nature & Scope
Essential Duties & Tasks
- Lead architecture development and solution design sessions, contributing strategic insights to ensure robust, scalable, and efficient data systems
- Collaborate with architects and cross-functional teams to design and implement end-to-end data solutions that address complex business needs
- Develop advanced data assets, including highly scalable ETL/ELT pipelines, optimized data structures, and dynamic data workflows on Wynn’s cloud infrastructure
- Drive data quality initiatives, including sophisticated data profiling, cleansing, and anomaly detection, leveraging advanced tools and methodologies
- Establish, enforce, and continuously refine data quality standards while designing automated processes for issue identification and resolution
- Act as a technical leader within the team, mentoring junior engineers, conducting detailed code reviews, and setting standards for best practices
- Analyze and translate complex business requirements into detailed technical specifications, ensuring solutions align with broader organizational goals
- Define, design, and implement advanced data ingestion patterns (batch, real-time streaming, and hybrid) tailored to diverse use cases and business requirements
- Lead data modeling initiatives, developing and optimizing both logical and physical data models to support high-performance analytics and operations
- Collaborate closely with stakeholders to design and deliver data models that meet advanced operational, analytical, and reporting needs
- Drive data governance initiatives by contributing to the design and implementation of robust security, compliance, and data privacy frameworks
- Ensure data integrity and quality by embedding validation, anomaly detection, and self-healing mechanisms throughout data workflows
- Create comprehensive technical documentation, including data flows, pipeline designs, and operational runbooks to support system maintainability
- Evaluate, recommend, and implement cutting-edge data ingestion, integration, and replication tools to enhance the efficiency of cloud-based analytics systems
- Lead proof-of-concept projects to assess and introduce innovative data engineering technologies, tools, and frameworks
- Design and implement highly reliable real-time data streaming solutions using platforms such as Azure Event Hubs, Kafka, or AWS Kinesis to meet time-sensitive business needs
- Develop and maintain complex, reusable ETL/ELT processes that can handle large-scale, dynamic datasets efficiently and securely.
- Own the maintenance and performance optimization of data ingestion tools, ensuring they meet high availability and reliability standards
- Continuously optimize existing data pipelines, identifying and resolving performance bottlenecks to improve scalability and cost-efficiency
- Provide advanced support to analytics and data science teams by curating high-quality, well-documented, and accessible datasets tailored to their needs
- Lead efforts to monitor and proactively troubleshoot data pipelines, implementing automated alerts and self-healing mechanisms to minimize downtime
- Research and propose strategies for adopting emerging trends in data engineering, driving innovation and process improvements.
- Implement and manage robust CI/CD workflows to automate pipeline deployment, testing, and version control for seamless operations
- Regularly upgrade and enhance data ingestion tools, ensuring system resilience and alignment with the latest industry best practices
- Contribute to cross-functional projects, demonstrating expertise in serverless architectures, API integration (e.g., FastAPI), and scalable cloud solutions
- Drive knowledge-sharing initiatives, such as training sessions and technical presentations, to elevate the team’s overall capabilities and expertise
Qualifications
- A Bachelor’s degree in computer science, information technology, or a related field is required; a Master’s degree is preferred but not mandatory
- Minimum age 21
- 5 to 7 years of hands-on experience in data engineering, demonstrating consistent career progression and technical growth
- Proven ability to design, develop, and deploy highly scalable and efficient data solutions for complex business needs
- Extensive experience managing and optimizing complex data integrations across diverse systems, platforms, and cloud environments
- Advanced proficiency in programming languages such as Python, SQL, and Shell Scripting, with the ability to implement optimized and scalable code solutions
- Deep expertise in data platforms like Snowflake and Databricks, including extensive experience working with PySpark and distributed dataframes to process large-scale datasets
- Advanced knowledge of orchestration tools such as Azure Data Factory, Apache Airflow, and Databricks workflows, including the ability to design and manage complex, multi-step workflows
- Significant hands-on experience with tools like DBT for data transformation and replication solutions such as Qlik for efficient data migration and synchronization
- Strong understanding of big data systems and frameworks, with practical experience in building and optimizing solutions for high-volume and high-velocity data
- Extensive experience with version control tools such as GitHub or Azure DevOps, including implementing CI/CD pipelines for data engineering workflows
- Advanced knowledge of serverless computing, including designing and deploying scalable solutions using Azure Functions with Python
- Proficiency in API development frameworks such as FastAPI, with the ability to create robust, efficient, and secure data-driven APIs
- Comprehensive expertise in designing and implementing ETL/ELT processes with a focus on performance, scalability, and maintainability
- Proven experience in data warehouse development, including hands-on expertise with dimensional modeling and schema optimization for analytics
- Solid English language communication skills, both written and verbal, for effective collaboration across teams and stakeholders
Preferred certifications:
- Snowflake SnowPro Core Certification
- Databricks Certified Data Engineer Professional
- Microsoft Azure Data Engineer Associate
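The data profiling and anomaly-detection duties described above can be illustrated with a minimal, self-contained sketch. The z-score rule, threshold, and sample row counts below are invented for the example; real pipelines would use richer profiling and route flagged loads to alerting rather than just returning indices.

```python
from statistics import mean, stdev

def flag_anomalies(values: list[float], threshold: float = 3.0) -> list[int]:
    """Return indices of values whose z-score exceeds the threshold.

    A toy stand-in for the kind of automated data-quality check a
    pipeline might run over, e.g., daily row counts per load.
    """
    mu, sigma = mean(values), stdev(values)
    if sigma == 0:
        return []  # all values identical: nothing can be an outlier
    return [i for i, v in enumerate(values) if abs(v - mu) / sigma > threshold]

# Daily row counts for six loads; the last one looks suspicious.
daily_rows = [1000.0, 1020.0, 980.0, 1010.0, 990.0, 5000.0]
bad = flag_anomalies(daily_rows, threshold=2.0)
```

Wiring a check like this into pipeline monitoring, with automated alerts on any flagged load, is one concrete form the "self-healing mechanisms" bullet above can take.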
This is an office-based position with regular working hours.
- Seniority level: Mid-Senior level
- Employment type: Full-time
- Job function: Information Technology
- Industries: Hospitality
Employer: Wynn Al Marjan Island (Ras al-Khaimah, United Arab Emirates)
Senior ETL Developer
Posted 1 day ago
Job Description
Employer: Dicetek LLC
We are seeking a highly skilled and experienced Senior ETL Developer to join our dynamic team. The ideal candidate will have extensive experience with Informatica BDM and Databricks pipelines, along with strong knowledge of SQL and PowerShell. The candidate should be proficient in designing ETL workflows and possess excellent communication skills. An understanding of data modeling and DAX queries is a plus.
Key Responsibilities
- Design, develop, and maintain ETL processes using Informatica BDM and Databricks pipeline.
- Collaborate with data architects and business analysts to understand data requirements and translate them into ETL solutions.
- Optimize and troubleshoot ETL workflows to ensure high performance and reliability.
- Develop and maintain scripts using Oracle SQL and PowerShell for data extraction, transformation, and loading.
- Ensure data quality and integrity throughout the ETL process.
- Document ETL processes, workflows, and data mappings.
- Communicate effectively with team members, stakeholders, and management to provide updates and gather requirements.
- Utilize data modeling techniques and DAX queries to enhance data analysis and reporting capabilities.
- Leverage Azure services and tools to support ETL processes and data integration.
Qualifications
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Minimum of 7 years of experience in ETL development.
- Strong experience with Informatica BDM and Databricks pipeline.
- Proficient in Oracle SQL and PowerShell.
- Experience in designing and optimizing ETL workflows.
- Excellent problem-solving skills and attention to detail.
- Strong communication and interpersonal skills.
- Ability to work independently and as part of a team.
- Understanding of data modeling and DAX queries is an added advantage.
- Experience with Azure services and tools.
- Knowledge of data warehousing concepts and best practices.
- Certification in Informatica, Databricks, or Azure is a plus.
- Seniority level: Not Applicable
- Employment type: Contract
- Job function: Business Development and Sales
- Industries: IT Services and IT Consulting
ETL Developer - SAP BODS
Posted 1 day ago
Job Description
In this role, you will have the opportunity to work closely with one of our esteemed clients. This client is a global leader in the Logistics Industry known for its commitment to quality and innovation. They have chosen Dautom as their trusted partner for their upcoming projects.
Job Title: ETL Developer - SAP BODS
Responsibilities:
- Extract data from external sources/web services and store it in the data lake.
- Apply transformations to prepare data extracted from source system databases such as Oracle and SQL Server.
- Perform data quality checks, integration, and cleansing of data before loading into data warehouses.
Requirements:
- Proficient knowledge of Oracle SQL is mandatory.
- Proficiency in Oracle/SQL to write Dynamic SQL Queries to extract data.
- Good understanding of data modelling concepts for structured and unstructured data architectures.
- Skilled in the complete lifecycle of SAP BODS/BODI development, including analysis, design, testing, implementation and support as well as providing regular updates.
- Responsible for delivering analyses for functional and business requirements.
- Strong written and verbal communications skills.
- Ability to recognize and clearly report relevant information.
- Bachelor’s degree in Computer Science, Computer Engineering, or another relevant technical field.
- 5+ years of experience in designing data marts/models using SAP BODS/BODI ETL or Informatica tools.
- Knowledge of writing Python or R scripts is an added advantage.
Benefits:
- Competitive salary and bonus structure.
- Comprehensive health and wellness benefits.
- Opportunities for professional development and growth.
- Flexible work arrangements, including remote work options.
- Employee recognition programs and a collaborative team environment.
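The extract-transform-load flow this posting describes can be sketched end to end in a few lines. The source rows, field names, and cleansing rules below are invented for illustration; a real BODS job would pull from Oracle or a web service and write to warehouse tables rather than a list.

```python
def extract() -> list[dict]:
    # Stand-in for pulling rows from a source system (web service or DB query).
    return [
        {"order_id": "A1", "amount": " 120.50 "},
        {"order_id": "A2", "amount": None},   # fails the quality check below
        {"order_id": "A3", "amount": "75"},
    ]

def transform(rows: list[dict]) -> list[dict]:
    # Cleanse and validate before loading: trim whitespace, cast to float,
    # and drop rows with missing amounts.
    clean = []
    for row in rows:
        if row["amount"] is None:
            continue  # a real job would route this to a reject/quarantine table
        clean.append({"order_id": row["order_id"],
                      "amount": float(str(row["amount"]).strip())})
    return clean

def load(rows: list[dict], warehouse: list[dict]) -> None:
    warehouse.extend(rows)

warehouse: list[dict] = []
load(transform(extract()), warehouse)
```

The same extract/transform/load separation is what BODS dataflows express graphically: source readers, transform stages, and target writers.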
Senior Oracle Data Integrator (ODI) ETL Developer
Posted 1 day ago
Job Description
Core Expertise In Oracle Data Integrator (ODI)
- Proficiency in designing and implementing data integration solutions using ODI 12c or later versions.
- Strong knowledge of:
- Knowledge Modules (KM): Creating and customizing reusable components for data integration processes.
- Mappings, Packages, and Scenarios: Designing complex data flows, workflows, and automating ETL processes.
- Topology Configuration: Configuring physical and logical schemas, data servers, and agents.
- Expertise in ODI's ELT framework and knowledge of performance optimization techniques.
- Experience in error handling, reconciliation frameworks, and debugging ODI processes.
- Strong SQL and PL/SQL skills for querying, transforming, and managing data.
- Deep knowledge of Oracle Database concepts, including:
- Indexing, partitioning, and materialized views.
- Query optimization techniques.
Data Warehousing And ETL Concepts
- In-depth understanding of:
- Dimensional modeling (Star and Snowflake schemas).
- Fact and dimension tables, surrogate keys, slowly changing dimensions (SCDs), and hierarchies.
- Data profiling, data cleansing, and data validation practices.
- Hands-on experience in building and maintaining data warehouses and data lakes.
- 5+ years of hands-on experience in developing and managing ETL/ELT processes using ODI.
- Proven experience in implementing large-scale, end-to-end data integration projects, including:
- Requirements gathering
- Data modeling
- Designing scalable ETL/ELT solutions.
- Experience in upgrading or migrating ODI environments (e.g., from ODI 11g to 12c).
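The slowly changing dimension handling mentioned above (SCD Type 2: expire the current row, insert a new version) can be sketched in plain Python. Column names (`nk`, `city`), the date format, and the open-ended `end_date = None` convention are assumptions for illustration; in ODI this logic would typically live in an SCD knowledge module.

```python
def scd2_merge(dim: list[dict], incoming: dict, load_date: str) -> None:
    """Apply an SCD Type 2 change: close out the current row, insert a new version."""
    for row in dim:
        if row["nk"] == incoming["nk"] and row["current"]:
            if row["city"] == incoming["city"]:
                return  # attribute unchanged: nothing to do
            row["current"] = False
            row["end_date"] = load_date  # expire the old version
    dim.append({**incoming, "start_date": load_date,
                "end_date": None, "current": True})

dim: list[dict] = []
scd2_merge(dim, {"nk": "C1", "city": "Dubai"}, "2024-01-01")
scd2_merge(dim, {"nk": "C1", "city": "Abu Dhabi"}, "2024-06-01")  # change → new version
```

After the second call the dimension holds two rows for `C1`: the expired Dubai row and the current Abu Dhabi row, which is exactly the history SCD Type 2 is meant to preserve.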