184 Data Engineer jobs in the United Arab Emirates
Informatica Data Engineers
Job Description
Informatica Data Engineers role at Dicetek LLC.
Experience: 5-8 Years
Summary: Experienced Informatica PowerCenter (PC) and IDMC developer with 5+ years in data engineering. Skilled in designing end-to-end data pipelines using PC/IDMC for effective migration and transformation across cloud platforms. The ideal candidate will have in-depth knowledge of Informatica PowerCenter and IDMC; a strong foundation in Data Warehousing concepts and proficiency in Snowflake and SQL are essential. Strong team player with agile experience, delivering timely, high-impact data solutions.
Technical Skills:
- Tools: Informatica Cloud Data Integration, Informatica PowerCenter
- Data Warehousing: Snowflake, Data Lake
- Programming: SQL, Python, Shell Scripting
- Data Management: Storage management, quality monitoring, governance
- Modeling: Dimensional modeling, star/snowflake schema
Responsibilities:
- Design, develop, and optimize ETL workflows using Informatica PowerCenter (PC) and Informatica IDMC.
- Manage data ingestion processes from diverse data sources such as Salesforce, Oracle databases, PostgreSQL, and MySQL.
- Implement and maintain ETL processes and data pipelines to ensure efficient data extraction, transformation, and loading.
- Utilize Snowflake as the data warehouse solution for managing large volumes of structured and unstructured data.
- Maintain and optimize ETL jobs for performance and reliability, ensuring timely data availability for business users.
- Support data migration, data integration, and data consolidation efforts.
- Write and maintain basic Python scripts for data processing and automation tasks.
- Utilize Unix shell commands for data-related tasks and system management.
- Troubleshoot and resolve ETL-related issues, ensuring data integrity and availability.
- Ensure adherence to best practices for data governance and security.
- Informatica Developer
- Developed ELT processes using PC/IDMC to integrate data into Snowflake.
- Implemented storage management for Azure Blob and Snowflake, enhancing data security.
- Worked with basic Python and shell scripting.
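For illustration only, a minimal sketch of the kind of basic Python automation script this role describes, loading an extract into Snowflake and validating the result. It assumes the snowflake-connector-python package and hypothetical warehouse, stage, table, and credential names, none of which come from the posting.

```python
import os
import snowflake.connector  # assumes the snowflake-connector-python package is installed

# Hypothetical connection details; in practice these would come from a secrets store.
conn = snowflake.connector.connect(
    account=os.environ["SF_ACCOUNT"],
    user=os.environ["SF_USER"],
    password=os.environ["SF_PASSWORD"],
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="STAGING",
)

try:
    cur = conn.cursor()
    # Load today's extract from a (hypothetical) external stage into a staging table.
    cur.execute("""
        COPY INTO STAGING.CUSTOMER_RAW
        FROM @ETL_STAGE/customer/
        FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
        ON_ERROR = 'ABORT_STATEMENT'
    """)
    # Simple post-load validation: fail loudly if nothing arrived.
    cur.execute("SELECT COUNT(*) FROM STAGING.CUSTOMER_RAW")
    row_count = cur.fetchone()[0]
    if row_count == 0:
        raise RuntimeError("COPY INTO loaded zero rows into STAGING.CUSTOMER_RAW")
    print(f"Loaded {row_count} rows")
finally:
    conn.close()
```

A script like this would typically be scheduled by the same orchestration that runs the PC/IDMC mappings, so load failures surface in one place.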
- Seniority level: Not Applicable
- Employment type: Contract
- Job function: Information Technology
- Industries: IT Services and IT Consulting
Big Data Engineer
Job Description
Role: Senior Lead Software Engineer
Skill: Big Data Engineer
Experience: 7 Years
Strong functional knowledge and experience in the banking domain.
Develop and optimize data pipelines using Spark, Hive, and Python on Cloudera.
Develop real-time data workflows using Kafka.
Design and develop APIs for data access and integration.
Utilize Hue, Oozie, and other Cloudera tools for job orchestration and data access.
Deploy solutions on cloud platforms such as AWS, Azure, or GCP.
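As an illustrative sketch only (not part of the posting): a minimal PySpark Structured Streaming job of the kind described above, reading a hypothetical banking transactions topic from Kafka and landing it as Parquet. The broker address, topic, schema, and paths are assumptions, and the spark-sql-kafka connector package must be supplied at submit time.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StructType, StructField, StringType, DoubleType, TimestampType

# Requires the spark-sql-kafka-0-10 package on the classpath (e.g. via --packages at spark-submit).
spark = SparkSession.builder.appName("txn-stream").getOrCreate()

# Hypothetical message schema for a transactions topic.
schema = StructType([
    StructField("txn_id", StringType()),
    StructField("account_id", StringType()),
    StructField("amount", DoubleType()),
    StructField("txn_ts", TimestampType()),
])

raw = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker1:9092")  # assumed broker address
    .option("subscribe", "transactions")                # assumed topic name
    .option("startingOffsets", "latest")
    .load()
)

parsed = raw.select(from_json(col("value").cast("string"), schema).alias("t")).select("t.*")

query = (
    parsed.writeStream
    .format("parquet")
    .option("path", "/data/lake/transactions")               # assumed landing path
    .option("checkpointLocation", "/data/checkpoints/txn")   # assumed checkpoint path
    .outputMode("append")
    .start()
)
query.awaitTermination()
```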
Required Skills and Experience:
Over 7 years of experience in Big Data engineering.
Hands-on experience with Cloudera Spark, Hive, Kafka, Python, Hue, and Ranger.
Strong understanding of distributed systems and cloud data services.
Proficient in API development and data security controls.
About Virtusa:
Join Virtusa and gain international experience working on leading Digital Transformation programs in the Middle East.
Virtusa is a rapidly growing IT services company with a presence in the UAE, KSA, Qatar, and Oman, working with top clients in banking, finance, travel, telecom, and enterprise sectors. We have received awards from Gartner, IDC, WfMC, and others for our exceptional work.
Be part of our award-winning team that values teamwork, quality of life, and professional growth. Join a global community of 30,000 professionals committed to your development, working on exciting projects with cutting-edge technologies.
Big Data Engineer
Job Description
Apt Resources is seeking an experienced Big Data Engineer for a government client in Abu Dhabi. You will design and implement large-scale data solutions to support AI/ML initiatives and public sector digital transformation.
Key Responsibilities:
Data Pipeline Development:
- Build robust data pipelines using Python, SQL/NoSQL, and Airflow (see the sketch after this section)
- Develop ETL/ELT processes for structured and unstructured data
- Manage data lakes and optimize storage solutions
Data Infrastructure:
- Design efficient data models for analytics
- Implement data governance and quality frameworks
- Work with cloud-based data platforms (Azure preferred)
AI/ML Support:
- Prepare and process datasets for machine learning applications
- Collaborate with ML teams on feature engineering
Requirements:
- 10-12 years of hands-on big data experience
- Expertise in:
- Python and SQL/NoSQL databases
- Airflow for workflow orchestration
- ETL/ELT pipeline development
- Cloud data platforms (Azure, AWS, or GCP)
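By way of illustration (not from the posting itself), a minimal Airflow DAG sketch of the Python/Airflow orchestration pattern listed above. The DAG id, schedule, task names, and callables are all hypothetical placeholders.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Placeholder: pull raw records from a source system into the data lake.
    print("extracting")


def transform(**context):
    # Placeholder: cleanse and reshape the extracted data for analytics.
    print("transforming")


with DAG(
    dag_id="daily_ingest",            # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)

    extract_task >> transform_task
```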
Salary: To be discussed
Big Data Architect
Job Description
We are seeking a seasoned Senior Big Data Management Developer to join our team.
This is a challenging and rewarding role that requires a strong understanding of big data technologies and the ability to design, develop, and implement ETL solutions using Informatica BDM.
The ideal candidate will have a proven track record of working with Informatica BDM and big data technologies, as well as excellent problem-solving skills and attention to detail.
Responsibilities:
- Designing, developing, and implementing ETL solutions using Informatica BDM
- Collaborating with cross-functional teams to gather and understand requirements
- Optimizing data workflows and processes for maximum efficiency
- Troubleshooting and resolving data integration issues
Requirements:
- Bachelor's degree in Computer Science or related field
- Proven experience working with Informatica BDM and big data technologies
- Strong understanding of data management principles
- Excellent problem-solving skills and attention to detail
About Us:
We are a leading recruitment agency specializing in connecting top talent with leading organizations across various industries.
Big Data Developer
Job Description
Apache Spark: Strong experience in batch data processing, including RDDs, DataFrames, Spark SQL, PySpark, and MLlib; streaming experience is an added advantage.
Cloud & Big Data Platforms: Experience with Databricks, Oracle DRCC, Hadoop, Hive, HBase, Pig, and other big data technologies.
Data Caching & Performance Optimization: Experience in improving data retrieval and processing performance.
Programming Skills: Strong in Java and Scala (Scala preferred).
Database Experience: Exposure to HDFS, Data Lake, PostgreSQL/MS SQL/SQL, and similar databases.
Data Security & Compliance: Experience with PII data handling, data masking, encryption, decryption, and related security measures.
Workflow Orchestration: Hands-on experience with Apache Airflow or Nifi.
Communication: Strong verbal and written communication skills, with the ability to collaborate across teams and explain technical concepts to non-technical stakeholders.
Secondary Skills
Apache Kafka: Basic hands-on experience is sufficient.
Deployment & Release Management: Experience in application deployment and production release processes.
CI/CD Pipelines: Exposure to designing, implementing, and managing continuous integration and continuous deployment pipelines using tools such as Jenkins, ArgoCD, and Azure CI/CD.
Data Ingestion Tools: Exposure to tools such as Apache Flume, Sqoop, or similar for large-scale data ingestion.
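For the PII handling and masking point above, a minimal PySpark sketch. The dataset path, column names, and the choice of hashing versus partial masking are assumptions for illustration, not requirements from the posting.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("pii-masking").getOrCreate()

# Hypothetical customer extract containing PII columns.
df = spark.read.parquet("/data/lake/customers_raw")

masked = (
    df
    # Replace the raw email with an irreversible SHA-256 digest usable for joins/deduplication.
    .withColumn("email_hash", F.sha2(F.col("email"), 256))
    # Keep only the last 4 digits of the phone number for support lookups.
    .withColumn("phone_masked", F.regexp_replace(F.col("phone"), r"\d(?=\d{4})", "*"))
    .drop("email", "phone")
)

masked.write.mode("overwrite").parquet("/data/lake/customers_masked")
```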
- Seniority level: Mid-Senior level
- Employment type: Contract
- Job function: Engineering and Information Technology
- Industries: IT Services and IT Consulting
Big Data Strategist
Job Description
Are you passionate about working with big data and innovative technologies?
Opportunity Overview
This is an incredible opportunity to work with a prestigious financial institution on cutting-edge big data projects.
Key Responsibilities & Requirements:
- 7-10 years of experience as a Big Data Engineer with expertise in Hadoop (Cloudera), Apache Spark, and similar frameworks.
- Proficient in Big Data querying tools such as Pig, Hive, and Impala, with strong coding skills in Python, Java, C, Ruby, PHP, or R, and comfort working on Linux.
- Skilled in solving complex networking, data, and software challenges.
- Capable of planning, organizing, and delivering projects effectively, with excellent teamwork and interpersonal communication skills.
You will enjoy the flexibility of a fully remote role and gain exposure to leading-edge big data projects.
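As a small illustration of the Hive/Impala-style querying mentioned in the requirements above, a PySpark sketch running Spark SQL against a Hive table. The database, table, and column names are hypothetical.

```python
from pyspark.sql import SparkSession

# Hive support lets Spark SQL read tables registered in the Hive metastore (as on Cloudera).
spark = (
    SparkSession.builder
    .appName("hive-query")
    .enableHiveSupport()
    .getOrCreate()
)

# Hypothetical Hive database/table; a similar query could be run in Hive or Impala directly.
daily_totals = spark.sql("""
    SELECT txn_date, branch_id, SUM(amount) AS total_amount
    FROM banking.transactions
    WHERE txn_date >= date_sub(current_date(), 7)
    GROUP BY txn_date, branch_id
""")

daily_totals.show(20, truncate=False)
```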
Big Data Architect
Job Description
A total of 15 years of experience in systems architecture design, including 10 years in architecture design of Big Data platforms, with the following skills:
- Proven experience in designing Big Data Platforms for both Infrastructure and Software Implementation.
- Good understanding of the architecture design stage.
Responsibilities include:
- Preparing detailed deployment architecture documents of the Enterprise Big Data Platform (redesign of existing architecture).
- Supporting CART review of the solution architecture of the Enterprise Big Data Platform.
- Coordinating the Big Data Platform technical specifications related to the Enterprise Big Data Platform infrastructure, software stack, etc.
Senior Big Data Engineer
Job Description
Data Engineering Role Summary
- Achieve business value by designing, constructing, and managing data pipelines for ingestion, transformation, storage, and analytics.
Our ideal candidate will have a deep understanding of data architecture, processing, and governance principles to ensure the delivery of high-quality data products. Key responsibilities include:
- Designing and developing scalable data architectures using Azure Databricks, Data Lake Storage, and other cloud-based technologies.
- Ensuring data quality, integrity, security, and compliance throughout the data lifecycle.
- Collaborating with cross-functional teams to deliver end-to-end data solutions that meet business needs.
Requirements:
- Minimum 5 years of experience in a data engineering or data platform role, preferably in a cloud computing environment.
- Strong expertise in SQL, Python, PySpark, Jupyter Notebooks, and related data processing frameworks.
- Experience working with ETL/ELT tools such as Azure Data Factory, Azure Synapse, or similar platforms.
- Excellent problem-solving skills, with the ability to analyze complex technical issues and implement effective solutions.
- Strong communication and collaboration skills to work effectively with both technical and non-technical stakeholders.
Key Skills:
- Cloud Computing (Azure)
- Big Data Technologies (Databricks, Data Lake Storage)
- Programming Languages (Python, SQL)
- ETL/ELT Tools (Data Factory, Synapse)
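For illustration, a minimal PySpark sketch of the Databricks/Data Lake Storage pattern listed above, reading raw CSV from ADLS Gen2 and writing a curated Delta table. The storage account, container, paths, and business key are hypothetical, and ADLS authentication is assumed to be configured on the cluster.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# On Databricks a SparkSession already exists as `spark`; getOrCreate() reuses it.
spark = SparkSession.builder.appName("adls-to-delta").getOrCreate()

# Hypothetical ADLS Gen2 path (abfss://<container>@<storage-account>.dfs.core.windows.net/...).
raw_path = "abfss://raw@examplelake.dfs.core.windows.net/sales/2024/"

raw = (
    spark.read
    .option("header", "true")
    .option("inferSchema", "true")
    .csv(raw_path)
)

cleaned = (
    raw
    .dropDuplicates(["order_id"])                      # assumed business key
    .withColumn("ingested_at", F.current_timestamp())
)

# Write as a Delta table (Delta Lake is available by default on Databricks).
cleaned.write.format("delta").mode("overwrite").save(
    "abfss://curated@examplelake.dfs.core.windows.net/sales/"
)
```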
Senior Big Data Engineer
Job Description
Are you an experienced data professional seeking a challenging role in a dynamic environment? We have an exciting opportunity for a skilled Data Engineer to join our team. As a key member of our data engineering group, you will be responsible for designing, developing, and maintaining large-scale data pipelines that meet the organization's business needs.
Key Responsibilities:
- Design and develop highly scalable and optimized ETL pipelines using PySpark on the Cloudera Data Platform
- Implement and manage data ingestion processes from various sources to the data lake or data warehouse
- Use PySpark to process, cleanse, and transform large datasets into meaningful formats that support analytical needs
- Conduct performance tuning of PySpark code and Cloudera components, optimizing resource utilization and reducing runtime of ETL processes
- Implement data quality checks, monitoring, and validation routines to ensure data accuracy and reliability throughout the pipeline
Requirements:
- Bachelor's or Master's degree in Computer Science, Data Engineering, Information Systems, or a related field
- 3+ years of experience as a Data Engineer, with a strong focus on PySpark and the Cloudera Data Platform
- Advanced proficiency in PySpark, including working with RDDs, DataFrames, and optimization techniques
- Strong experience with Cloudera Data Platform (CDP) components, including Cloudera Manager, Hive, Impala, HDFS, and HBase
What we offer:
- A competitive salary and benefits package
- Opportunities for career growth and professional development
- A collaborative and dynamic work environment
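A minimal sketch of the kind of data quality validation routine mentioned in the responsibilities above; the dataset path, column names, and specific checks are assumptions for illustration.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("dq-checks").getOrCreate()

# Hypothetical curated dataset produced by an upstream PySpark ETL job.
df = spark.read.parquet("/data/warehouse/orders")

checks = {
    "null_order_id": df.filter(F.col("order_id").isNull()).count(),
    "negative_amount": df.filter(F.col("amount") < 0).count(),
    "duplicate_order_id": df.count() - df.dropDuplicates(["order_id"]).count(),
}

failed = {name: count for name, count in checks.items() if count > 0}
if failed:
    # Fail the pipeline run so bad data never reaches downstream consumers.
    raise ValueError(f"Data quality checks failed: {failed}")

print("All data quality checks passed")
```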
Big Data Management Professional
Job Description
We are looking for an experienced professional in big data management to join our team. The ideal candidate will have a strong background in big data technologies and excellent problem-solving skills.