117 Big Data jobs in the United Arab Emirates
Big Data Architect
Posted today
Job Description
We are seeking a seasoned Senior Big Data Management Developer to join our team.
This is a challenging and rewarding role that requires a strong understanding of big data technologies and the ability to design, develop, and implement ETL solutions using Informatica BDM.
The ideal candidate will have a proven track record of working with Informatica BDM and big data technologies, as well as excellent problem-solving skills and attention to detail.
- Designing, developing, and implementing ETL solutions using Informatica BDM
- Collaborating with cross-functional teams to gather and understand requirements
- Optimizing data workflows and processes for maximum efficiency
- Troubleshooting and resolving data integration issues
Requirements:
- Bachelor's degree in Computer Science or related field
- Proven experience working with Informatica BDM and big data technologies
- Strong understanding of data management principles
- Excellent problem-solving skills and attention to detail
About Us:
We are a leading recruitment agency specializing in connecting top talent with leading organizations across various industries.
Big Data Engineer
Posted today
Job Description
Apt Resources is seeking an experienced Big Data Engineer for a government client in Abu Dhabi. You will design and implement large-scale data solutions to support AI/ML initiatives and public sector digital transformation.
Key Responsibilities:
Data Pipeline Development:
- Build robust data pipelines using Python, SQL/NoSQL, and Airflow (see the sketch at the end of this listing)
- Develop ETL/ELT processes for structured and unstructured data
- Manage data lakes and optimize storage solutions
Data Infrastructure:
- Design efficient data models for analytics
- Implement data governance and quality frameworks
- Work with cloud-based data platforms (Azure preferred)
AI/ML Support:
- Prepare and process datasets for machine learning applications
- Collaborate with ML teams on feature engineering
Requirements:
- 10-12 years of hands-on big data experience
- Expertise in:
- Python and SQL/NoSQL databases
- Airflow for workflow orchestration
- ETL/ELT pipeline development
- Cloud data platforms (Azure, AWS, or GCP)
To be discussed
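Illustrative only: a minimal sketch of the kind of Airflow-orchestrated ETL pipeline this role describes, assuming Airflow 2.4+ and pandas; the DAG name, file paths, and transformation logic are hypothetical placeholders rather than part of the job requirements.

```python
# Minimal sketch of an Airflow-orchestrated ETL pipeline (illustrative only;
# paths and transformation logic are hypothetical placeholders).
from datetime import datetime

import pandas as pd
from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Pull raw records from a hypothetical landing zone and stage them.
    df = pd.read_csv("/data/landing/events.csv")
    df.to_parquet("/data/staging/events.parquet")


def transform(**context):
    # Cleanse and deduplicate the staged data for analytics.
    df = pd.read_parquet("/data/staging/events.parquet")
    df = df.dropna(subset=["event_id"]).drop_duplicates("event_id")
    df.to_parquet("/data/curated/events.parquet")


with DAG(
    dag_id="events_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    extract_task >> transform_task
```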
Big Data Engineer
Posted today
Job Description
Role: Senior Lead Software Engineer
Skill: Big Data Engineer
Experience: 7 Years
Strong functional knowledge and experience in the banking domain.
Develop and optimize data pipelines using Spark, Hive, and Python on Cloudera.
Develop real-time data workflows using Kafka.
Design and develop APIs for data access and integration.
Utilize Hue, Oozie, and other Cloudera tools for job orchestration and data access.
Deploy solutions on cloud platforms such as AWS, Azure, or GCP.
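As a rough illustration of the real-time Kafka workflows and Spark development described above, here is a minimal PySpark Structured Streaming sketch; the broker address, topic, and output paths are hypothetical, and the spark-sql-kafka connector is assumed to be on the classpath.

```python
# Minimal sketch of a real-time Kafka ingestion job with PySpark Structured
# Streaming (broker, topic, and paths are hypothetical placeholders).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("kafka-transactions-stream").getOrCreate()

# Read the raw event stream from a Kafka topic.
raw = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "transactions")
    .load()
)

# Kafka delivers key/value as binary; cast the payload to string for parsing.
events = raw.select(F.col("value").cast("string").alias("payload"))

# Persist the stream to the data lake in Parquet, with checkpointing for recovery.
query = (
    events.writeStream.format("parquet")
    .option("path", "/data/lake/transactions")
    .option("checkpointLocation", "/data/checkpoints/transactions")
    .start()
)
query.awaitTermination()
```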
Required Skills and Experience:
Over 7 years of experience in Big Data engineering.
Hands-on experience with Cloudera Spark, Hive, Kafka, Python, Hue, and Ranger.
Strong understanding of distributed systems and cloud data services.
Proficient in API development and data security controls.
About Virtusa:
Join Virtusa and gain international experience working on leading Digital Transformation programs in the Middle East.
Virtusa is a rapidly growing IT services company with a presence in the UAE, KSA, Qatar, and Oman, working with top clients in banking, finance, travel, telecom, and enterprise sectors. We have received awards from Gartner, IDC, WfMC, and others for our exceptional work.
Be part of our award-winning team that values teamwork, quality of life, and professional growth. Join a global community of 30,000 professionals committed to your development, working on exciting projects with cutting-edge technologies.
Big Data Developer
Posted today
Job Description
Apache Spark: Strong experience in batch data processing, including RDDs, DataFrames, Spark SQL, PySpark, and MLlib; streaming experience is an added advantage.
Cloud & Big Data Platforms: Experience with Databricks, Oracle DRCC, Hadoop, Hive, HBase, Pig, and other big data technologies.
Data Caching & Performance Optimization: Experience in improving data retrieval and processing performance.
Programming Skills: Strong in Java and Scala (Scala preferred).
Database Experience: Exposure to HDFS, Data Lake, PostgreSQL/MS SQL/SQL, and similar databases.
Data Security & Compliance: Experience with PII data handling, data masking, encryption, decryption, and related security measures.
Workflow Orchestration: Hands-on experience with Apache Airflow or Nifi.
Communication: Strong verbal and written communication skills, with the ability to collaborate across teams and explain technical concepts to non-technical stakeholders.
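As a rough illustration of the PII handling and Spark DataFrame skills listed above, the following minimal PySpark sketch hashes and truncates sensitive columns; the column names and paths are hypothetical, and hashing is shown as one common masking approach rather than a prescribed one.

```python
# Minimal sketch of PII masking on a Spark DataFrame (column names and paths
# are hypothetical; hashing is one common approach, not the only one).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("pii-masking").getOrCreate()

customers = spark.read.parquet("/data/raw/customers")

masked = (
    customers
    # Replace the raw email with an irreversible SHA-256 digest.
    .withColumn("email_hash", F.sha2(F.col("email"), 256))
    # Keep only the last four digits of the phone number.
    .withColumn("phone_masked", F.concat(F.lit("***"), F.col("phone").substr(-4, 4)))
    .drop("email", "phone")
)

masked.write.mode("overwrite").parquet("/data/secure/customers")
```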
Secondary Skills
Apache Kafka: Basic hands-on experience is sufficient.
Deployment & Release Management: Experience in application deployment and production release processes.
CI/CD Pipelines: Exposure to designing, implementing, and managing continuous integration and continuous deployment pipelines using tools such as Jenkins, ArgoCD, and Azure CI/CD.
Data Ingestion Tools: Exposure to tools such as Apache Flume, Sqoop, or similar for large-scale data ingestion.
Apply:
- Seniority level: Mid-Senior level
- Employment type: Contract
- Job function: Engineering and Information Technology
- Industries: IT Services and IT Consulting
Big Data Strategist
Posted today
Job Description
Are you passionate about working with big data and innovative technologies?
Opportunity Overview
This is an incredible opportunity to work with a prestigious financial institution on cutting-edge big data projects.
Key Responsibilities & Requirements
- 7-10 years of experience as a Big Data Engineer with expertise in Hadoop (Cloudera), Apache Spark, and similar frameworks.
- Proficient in big data querying tools such as Pig, Hive, and Impala, with strong coding skills in Python, Java, C, Ruby, PHP, or R, plus solid Linux skills.
- Skilled in solving complex networking, data, and software challenges.
- Capable of planning, organizing, and delivering projects effectively, with excellent teamwork and interpersonal communication skills.
You will enjoy the flexibility of a fully remote role and gain exposure to leading-edge big data projects.
Big Data Engineering Leadership Role
Posted today
Job Description
Big Data Engineering Leadership Role
As a senior big data engineer, you will be responsible for leading the development of complex distributed systems and big data technologies.
- Design, develop, and deploy scalable big data architectures using Hadoop and Spark.
- Lead a team of engineers to design, implement, and maintain big data solutions across multiple industries.
- Develop and maintain knowledge in big data querying tools such as Pig, Hive, and Impala.
- Collaborate with cross-functional teams to design and implement big data analytics solutions.
Required Skills and Qualifications
- 10+ years of experience as a big data engineer.
- Expertise in Hadoop (Cloudera), Spark, and similar frameworks.
- Good knowledge of big data querying tools such as Pig, Hive, and Impala.
- Knowledge of programming and scripting languages including Java, C++, Ruby, PHP, Python, and R, plus Linux experience.
- Excellent leadership and communication skills.
- Ability to solve complex networking, data, and software issues.
- Ability to plan and organize work effectively.
- Strong interpersonal communication skills.
- Willingness to assist others with their tasks to support group goals.
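As a rough illustration of the big data querying skills listed above, the following minimal sketch queries a Hive-managed table through Spark SQL; the database, table, and column names are hypothetical.

```python
# Minimal sketch of querying a Hive-managed table from Spark SQL
# (database, table, and column names are hypothetical).
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder.appName("hive-query")
    .enableHiveSupport()  # lets Spark read table definitions from the Hive metastore
    .getOrCreate()
)

daily_totals = spark.sql(
    """
    SELECT event_date, COUNT(*) AS events
    FROM analytics.web_events
    WHERE event_date >= date_sub(current_date(), 7)
    GROUP BY event_date
    ORDER BY event_date
    """
)
daily_totals.show()
```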
Benefits
This role offers a challenging opportunity for a seasoned big data engineer to grow professionally and contribute to the success of our organization. You will have the chance to work on high-profile projects, collaborate with top talent, and receive competitive compensation and benefits.
Others
Our organization is committed to diversity, equity, and inclusion. We welcome applications from individuals who share our values and are passionate about making a difference in the field of big data engineering.
Senior Big Data Engineer
Posted today
Job Description
Are you an experienced data professional seeking a challenging role in a dynamic environment? We have an exciting opportunity for a skilled Data Engineer to join our team. As a key member of our data engineering group, you will be responsible for designing, developing, and maintaining large-scale data pipelines that meet the organization's business needs.
Key Responsibilities:
- Design and develop highly scalable and optimized ETL pipelines using PySpark on the Cloudera Data Platform
- Implement and manage data ingestion processes from various sources to the data lake or data warehouse
- Use PySpark to process, cleanse, and transform large datasets into meaningful formats that support analytical needs
- Conduct performance tuning of PySpark code and Cloudera components, optimizing resource utilization and reducing runtime of ETL processes
- Implement data quality checks, monitoring, and validation routines to ensure data accuracy and reliability throughout the pipeline
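As a rough illustration of the data quality checks described in the last bullet, here is a minimal PySpark sketch; the table, key column, and failure threshold are hypothetical.

```python
# Minimal sketch of a data quality gate in a PySpark ETL step (table name,
# key column, and the 1% failure threshold are hypothetical).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("dq-checks").getOrCreate()

orders = spark.read.parquet("/data/curated/orders")

total = orders.count()
null_keys = orders.filter(F.col("order_id").isNull()).count()
duplicates = total - orders.dropDuplicates(["order_id"]).count()

# Fail the pipeline run if more than 1% of rows are unusable.
bad_ratio = (null_keys + duplicates) / max(total, 1)
if bad_ratio > 0.01:
    raise ValueError(
        f"Data quality check failed: {null_keys} null keys, {duplicates} duplicates"
    )
```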
Requirements:
- Bachelor's or Master's degree in Computer Science, Data Engineering, Information Systems, or a related field
- 3+ years of experience as a Data Engineer, with a strong focus on PySpark and the Cloudera Data Platform
- Advanced proficiency in PySpark, including working with RDDs, DataFrames, and optimization techniques
- Strong experience with Cloudera Data Platform (CDP) components, including Cloudera Manager, Hive, Impala, HDFS, and HBase
Benefits:
- A competitive salary and benefits package
- Opportunities for career growth and professional development
- A collaborative and dynamic work environment
Senior Big Data Analyst
Posted today
Job Description
Big Data Developer Job Description:
The role of a Big Data Developer involves working with large-scale data processing and analysis. This requires expertise in various big data technologies such as Apache Spark, Hadoop, Hive, HBase, Pig, and cloud-based platforms like Databricks.
Required Skills and Qualifications:
- Programming Skills: Strong experience in Java and Scala programming languages is required. Knowledge of other languages like Python is also beneficial.
- Database Experience: Familiarity with databases like HDFS, Data Lake, PostgreSQL, MS SQL, and similar databases is necessary.
- Cloud Platforms: Experience with cloud-based big data platforms like Databricks, Oracle DRCC, and other related technologies is essential.
- Data Security and Compliance: Understanding of data security measures like PII data handling, data masking, encryption, decryption, and related security protocols is crucial.
- Workflow Orchestration: Hands-on experience with workflow orchestration tools like Apache Airflow or Nifi is necessary.
Benefits:
- Opportunity to work on challenging projects involving big data processing and analysis.
- Chance to collaborate with a team of experienced professionals in the field.
- Professional growth and development opportunities.
Job Function and Industry:
- Engineering and Information Technology.
- IT Services and IT Consulting.
Employment Type:
- Contract.
Seniority Level:
- Mid-Senior level.
Senior Big Data Engineer
Posted today
Job Description
Data Engineering Role Summary
- Achieve business value by designing, constructing, and managing data pipelines for ingestion, transformation, storage, and analytics.
Our ideal candidate will have a deep understanding of data architecture, processing, and governance principles to ensure the delivery of high-quality data products. Key responsibilities include:
- Designing and developing scalable data architectures using Azure Databricks, Data Lake Storage, and other cloud-based technologies.
- Ensuring data quality, integrity, security, and compliance throughout the data lifecycle.
- Collaborating with cross-functional teams to deliver end-to-end data solutions that meet business needs.
Requirements:
- Minimum 5 years of experience in a data engineering or data platform role, preferably in a cloud computing environment.
- Strong expertise in SQL, Python, PySpark, Jupyter Notebooks, and related data processing frameworks.
- Experience working with ETL/ELT tools such as Azure Data Factory, Azure Synapse, or similar platforms.
- Excellent problem-solving skills, with the ability to analyze complex technical issues and implement effective solutions.
- Strong communication and collaboration skills to work effectively with both technical and non-technical stakeholders.
Key Skills:
- Cloud Computing (Azure)
- Big Data Technologies (Databricks, Data Lake Storage)
- Programming Languages (Python, SQL)
- ETL/ELT Tools (Data Factory, Synapse)
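As a rough illustration of the Databricks and Data Lake Storage skills listed above, here is a minimal PySpark aggregation sketch over Azure Data Lake Storage Gen2; the storage account, container names, and columns are hypothetical, and on Databricks the SparkSession is already provided as `spark`.

```python
# Minimal sketch of a PySpark job over Azure Data Lake Storage Gen2 (storage
# account, containers, and columns are hypothetical; assumes storage
# credentials are already configured in the cluster).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("adls-curation").getOrCreate()

raw_path = "abfss://raw@examplelake.dfs.core.windows.net/sales/"
curated_path = "abfss://curated@examplelake.dfs.core.windows.net/sales_daily/"

sales = spark.read.parquet(raw_path)

# Aggregate raw sales into a daily, per-store summary for analytics.
daily = (
    sales.groupBy("sale_date", "store_id")
    .agg(F.sum("amount").alias("total_amount"))
)

daily.write.mode("overwrite").partitionBy("sale_date").parquet(curated_path)
```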
Senior Big Data Specialist
Posted today
Job Description
We are seeking an experienced Data Engineer to join our team. The successful candidate will design, develop and deploy scalable data pipelines using various tools and technologies such as RDBMS, NoSQL DBs, log files and events.
The role involves working with product teams to improve products and meet user needs, participating in sprint planning and ensuring projects are deployable and monitorable from outside.
In addition, the candidate should have experience with Apache Nifi for designing workflows and managing data pipelines, proficiency in working with Kafka for messaging and streaming use cases, and strong understanding of Spark for batch processing and Flink for real-time stream processing.
The ideal candidate should also have familiarity with Debezium for change data capture (CDC) and real-time data synchronization, a good understanding of data architecture principles, and experience with big data environments, including advising the Analytics team on best practices and new technologies.
Experience storing data in systems such as S3 and Kafka, along with knowledge of the basic principles of distributed computing and data modeling, is required. Scripting or programming skills and in-depth knowledge of Spark, Kafka, Airflow, Apache Avro, Parquet, and ORC are essential (see the sketch below for the file formats). A good understanding of OLAP data model design, experience working in an agile environment, knowledge of version control systems such as Git, and experience delivering large-scale analytics, insight, decision-making, and reporting solutions on big data technology are desirable.
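As a rough illustration of working with the file formats mentioned above, here is a minimal sketch that converts Avro landing files into date-partitioned Parquet suitable for OLAP-style querying; the paths and columns are hypothetical, and the spark-avro package is assumed to be available.

```python
# Minimal sketch of a batch conversion from Avro landing files to partitioned
# Parquet (paths and columns are hypothetical; reading Avro requires the
# spark-avro package on the classpath).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("avro-to-parquet").getOrCreate()

events = spark.read.format("avro").load("s3a://landing/events/")

# Derive a partition column from the event timestamp.
curated = events.withColumn("event_date", F.to_date("event_ts"))

(
    curated.write.mode("append")
    .partitionBy("event_date")
    .parquet("s3a://warehouse/events/")
)
```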