What Jobs are available for Cloud Data Engineering in Dubai?

Showing 52 Cloud Data Engineering jobs in Dubai

Senior Data Management Consultant, Customer Experience

Dubai, Dubai Informatica Corp.

Posted today

Job Description

A leading technology company is seeking a Principal Consultant in Customer Experience to provide expert data management consulting services. The role involves working closely with customers to deliver solutions for Data Governance and Integration using Informatica products. Candidates should have over 10 years of industry experience, be fluent in English and Arabic, and possess strong technical leadership skills. The position offers opportunities for mentorship and professional growth.

Head of Data Engineering

Dubai, Dubai M2

Posted today

Job Description

As the Head of Data Engineering, you will be responsible for designing, implementing, and maintaining a robust, scalable, and compliant data architecture that supports our exchange’s operations, analytics, and regulatory reporting requirements. You will lead a team of data engineers, ensuring high availability, security, and performance of our data infrastructure. You will work closely with stakeholders across technology, compliance, risk, and product teams to develop data pipelines, warehouses, and real-time analytics capabilities.

This is a strategic yet hands-on role where you will drive data engineering best practices, scalability, and automation while ensuring compliance with regulatory data requirements for a financial services entity.

Key Responsibilities

Data Architecture & Strategy :

  1. Define and own the data architecture strategy for the exchange, ensuring it is scalable, secure, and regulatory-compliant.
  2. Design and implement data modeling, governance, and security frameworks to meet financial and regulatory requirements.
  3. Architect real-time and batch processing data pipelines to handle trade data, order books, user activity, and market analytics.
  4. Optimize data storage and retrieval for performance and cost efficiency, leveraging cloud-based and hybrid solutions.
  5. Build and maintain ETL / ELT data pipelines for operational, analytical, and compliance-related data.
  6. Ensure high data quality, reliability, and availability through robust monitoring, alerting, and data validation techniques.
  7. Manage and enhance data warehouses, data lakes, and streaming platforms to support business intelligence and machine learning use cases.
  8. Oversee database design and optimization for transactional and analytical workloads (e.g., Aurora, Redis, Kafka).
  9. Implement data lineage, metadata management, and access control mechanisms in line with compliance requirements.
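Item 6 above calls for robust data validation in the pipelines. As a rough illustration only (the record shape and the validation rules are hypothetical, not taken from this posting), a minimal quality gate for trade records might look like this in Python:

```python
from dataclasses import dataclass

@dataclass
class Trade:
    symbol: str
    price: float
    qty: float

def validate_trade(t: Trade) -> list:
    """Return a list of data-quality violations for one trade record."""
    errors = []
    if not t.symbol:
        errors.append("missing symbol")
    if t.price <= 0:
        errors.append("non-positive price")
    if t.qty <= 0:
        errors.append("non-positive quantity")
    return errors

def partition_batch(trades):
    """Split a batch into clean rows and rejects for quarantine/alerting."""
    clean, rejects = [], []
    for t in trades:
        errs = validate_trade(t)
        (rejects if errs else clean).append((t, errs))
    return clean, rejects
```

In a real pipeline the reject side would feed the monitoring and alerting mentioned above rather than being silently dropped.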

Compliance, Security & Risk Management :

  1. Work closely with compliance and risk teams to ensure data retention policies, audit trails, and reporting mechanisms meet regulatory requirements (e.g., FATF, AML, GDPR, MiCA).
  2. Implement encryption, anonymization, and access control policies to safeguard sensitive user and transaction data.
  3. Support fraud detection and risk monitoring through data analytics and alerting frameworks.
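The anonymization work in item 2 is commonly implemented as keyed pseudonymization. A minimal sketch, assuming HMAC-SHA256 over a hypothetical user identifier (this is an illustration, not the employer's actual scheme):

```python
import hashlib
import hmac

def pseudonymize(user_id: str, secret_key: bytes) -> str:
    """Replace a user identifier with a keyed, irreversible token.

    Using HMAC rather than a bare hash resists dictionary attacks on
    low-entropy identifiers; the same key yields stable tokens, so joins
    across tables still work after anonymization.
    """
    return hmac.new(secret_key, user_id.encode("utf-8"), hashlib.sha256).hexdigest()
```

Rotating or segregating the key is then a compliance decision: the same key preserves joinability, a per-dataset key prevents cross-dataset linkage.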

Leadership & Team Management :

  1. Lead, mentor, and grow a small team of data engineers, fostering a culture of collaboration, innovation, and accountability.
  2. Drive best practices in data engineering, DevOps for data, and CI/CD automation for analytics infrastructure.
  3. Collaborate with software engineers, DevOps, data analysts, and product teams to integrate data solutions into the broader exchange ecosystem.

Technical competencies and skills :

  1. Proven experience in data architecture and engineering, preferably within a regulated financial or crypto environment.
  2. Strong proficiency in SQL, Python, or Scala for data engineering.
  3. Experience with cloud-based data platforms (AWS, GCP, or Azure) and orchestration tools (Airflow, Prefect, Dagster).
  4. Hands-on experience with real-time data processing (Kafka, Pulsar, Flink, Spark Streaming).
  5. Expertise in data warehousing solutions (Snowflake, BigQuery, Redshift, Databricks).
  6. Strong understanding of database design, indexing strategies, and query optimization.
  7. Experience implementing data governance, lineage, and cataloging tools.
  8. Familiarity with blockchain / crypto data structures and APIs is a plus.

Leadership & Strategic Skills :

  1. Experience leading and mentoring a team of data engineers.
  2. Ability to design data strategies that align with business goals and regulatory requirements.
  3. Strong cross-functional collaboration skills with compliance, risk, and technology teams.
  4. Ability to work in a fast-paced, high-growth startup environment with a hands-on approach.

Industry & Compliance Knowledge :

  1. Experience in regulated financial markets, fintech, or crypto is highly preferred.
  2. Familiarity with financial data standards, KYC / AML reporting, and regulatory requirements related to data handling.

Preferred Qualifications :

  1. Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
  2. Certifications in cloud data engineering (AWS / GCP / Azure), data governance, or security are a plus.
  3. Experience working in a crypto exchange, trading platform, or high-frequency trading environment is an advantage.

At M2, we believe in a workplace where talent, dedication, and passion are the only factors that count, regardless of gender, background, age, and other characteristics. We embrace diversity because we know that it fuels innovation, fosters creativity, and drives success. So, if you're ready to join a team where your potential is truly valued, welcome aboard!


OCI Cloud Engineer

Dubai, Dubai Everythinginclick

Posted today

Job Description

We are seeking a highly skilled and experienced OCI Cloud Engineer to join our team in Dubai, UAE. In this role, you will be responsible for designing, implementing, and managing cloud-based solutions using Oracle Cloud Infrastructure (OCI). The ideal candidate will have a strong background in cloud technologies and a passion for leveraging cloud services to drive innovation and efficiency.

Key Skills and Experience of OCI Cloud Engineer
  1. 3+ years of experience working with Oracle Cloud Infrastructure (OCI), including core services like Compute, Storage, Networking, and IAM.
  2. Strong understanding of OCI networking components, including VCNs, Security Lists, NSGs, Load Balancers, and Gateways.
  3. Advanced knowledge of Terraform, including writing complex reusable modules, managing state files, and troubleshooting IaC deployments.
  4. Experience integrating Terraform with CI/CD pipelines for automated deployments.
  5. Hands-on experience creating and managing Ansible playbooks, roles, and inventories.
  6. Proficiency in using Ansible for configuration management, patching, and software deployment in cloud environments.
  7. Deep understanding of TCP/IP, DNS, BGP, VPNs, and hybrid connectivity solutions such as FastConnect.
  8. Knowledge of cloud security best practices, including IAM policies, encryption, and vulnerability management.
  9. Experience in scripting with Python, Bash, or PowerShell for custom automation tasks.
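The VCN networking knowledge in item 2 and the custom automation scripting in item 9 often meet in small pre-flight checks. As a hedged sketch using only the Python standard library (the CIDR values are illustrative), a check that proposed subnet ranges do not overlap, which cloud control planes reject, could look like:

```python
import ipaddress

def find_overlaps(cidrs):
    """Return pairs of CIDR blocks that overlap -- useful as a pre-flight
    check before creating subnets in a VCN, where overlapping ranges
    are rejected."""
    nets = [ipaddress.ip_network(c) for c in cidrs]
    overlaps = []
    for i in range(len(nets)):
        for j in range(i + 1, len(nets)):
            if nets[i].overlaps(nets[j]):
                overlaps.append((str(nets[i]), str(nets[j])))
    return overlaps
```

A check like this would typically run in a CI pipeline before Terraform applies networking changes.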
Qualification Required for OCI Cloud Engineer
  1. Bachelor's Degree in Computer Science, Information Technology, or a related field.
  2. Minimum of 3 years of experience working with cloud technologies, preferably OCI.
  3. Strong understanding of cloud architecture, security, and best practices.
  4. Proficiency in scripting and automation tools.
  5. Excellent problem-solving and analytical skills.
  6. Strong communication and interpersonal skills.
  7. Ability to work effectively in a fast-paced and dynamic environment.

Data Engineer

Dubai, Dubai Bayut | dubizzle

Posted today

Job Description

Property Monitor is the UAE’s leading real estate technology and market intelligence platform, recently acquired by Dubizzle Group. At Property Monitor, we empower developers, brokers, investors, and property professionals with authoritative data and powerful analytics, enabling them to make faster, smarter, and more informed decisions.

As part of Dubizzle Group, we stand alongside five powerhouse brands, including market-leading platforms like Bayut and dubizzle, trusted by over 123 million monthly users. Together, these brands shape how people buy, sell, and connect across real estate, classifieds, and services in the UAE and broader region.

The Data Engineer will help deliver world-class big data solutions and drive impact for the dubizzle business. You will be responsible for exciting projects covering the end-to-end data life cycle – from raw data integrations with primary and third-party systems, through advanced data modeling, to state-of-the-art data visualization and development of innovative data products.

You will have the opportunity to build and work with both batch and real-time data processing pipelines. While working in a modern cloud-based data warehousing environment alongside a team of diverse, intense and interesting co-workers, you will liaise with other teams – such as product & tech, the core business verticals, trust & safety, finance and others – to enable them to be successful.

In this role, you will be responsible for:

  • Raw data integrations with primary and third-party systems
  • Data warehouse modelling for operational and application data layers
  • Development in Amazon Redshift cluster
  • SQL development as part of agile team workflow
  • ETL design and implementation in Matillion ETL
  • Real-time data pipelines and applications using serverless and managed AWS services such as Lambda, Kinesis, API Gateway, etc.
  • Design and implementation of data products enabling data-driven features or business solutions
  • Data quality, system stability and security
  • Coding standards in SQL, Python, ETL design
  • Building data dashboards and advanced visualisations in Periscope Data with a focus on UX, simplicity and usability
  • Working with other departments on data products – i.e. product & technology, marketing & growth, finance, core business, advertising and others
  • Being part and contributing towards a strong team culture and ambition to be on the cutting edge of big data
  • Working autonomously without supervision on complex projects
  • Participating in the early morning ETL status check rota
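The early morning ETL status check mentioned above is, in essence, a data-freshness check against an SLA. A minimal, hypothetical sketch (the table names and thresholds are illustrative, not from this posting):

```python
from datetime import datetime, timedelta

def stale_loads(last_loaded: dict, now: datetime, sla: timedelta) -> list:
    """Return table names whose most recent successful load breaches the SLA.

    `last_loaded` maps table name -> timestamp of its last successful load.
    """
    return sorted(table for table, ts in last_loaded.items() if now - ts > sla)
```

In practice the timestamps would come from an ETL audit table in the warehouse, and a non-empty result would page the person on the rota.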

Requirements:

  • Top-of-class technical degree in a field such as computer science, engineering, math, or physics
  • 3+ years of experience working with customer-centric data at big-data scale, preferably in an online / e-commerce context
  • 2+ years of experience with one or more programming languages, especially Python
  • Strong track record in business intelligence solutions, building and scaling data warehouses and data modelling
  • Experience with modern big data ETL tools is a plus (e.g. Matillion)
  • Experience with AWS data ecosystem (or other cloud providers)
  • Experience with modern data visualization platforms such as Sisense (formerly Periscope Data), Google Data Studio, Tableau, MS Power BI etc.
  • Knowledge of modern real-time data pipelines is a strong plus (e.g. serverless framework, Lambda, Kinesis, etc.)
  • Knowledge of relational and dimensional data models
  • Knowledge of terminal operations and Linux workflows
  • World-class SQL skills across a variety of relational data warehousing technologies especially in cloud data warehousing (e.g. Amazon Redshift, Google BigQuery, Snowflake, Vertica, etc.)
  • Ability to communicate insights and findings to a non-technical audience

What We Offer:

  • A fast paced, high performing team.
  • Multicultural environment with over 50 different nationalities
  • Competitive Tax-free Salary
  • Comprehensive Health Insurance
  • Annual Air Ticket Allowance
  • Employee discounts at multiple vendors across the emirates
  • Rewards & Recognitions
  • Learning & Development

Dubizzle Group is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees.



Data Engineer

Dubai, Dubai Theintechgroup

Posted today

Job Description

Education / Qualifications / Professional Training

Bachelor’s Degree in Computer Science or Management with 4+ years of experience in Vertica and equivalent databases

  • Vertica Certification is a plus.
  • Experience with data visualization tools (e.g., Power BI, SAP BI, SAS, Tableau) for data reporting and dashboard creation is beneficial.

Work Experience

More than 4 years of experience in Vertica database functionalities is required.

Technical Competencies
  1. Bachelor’s degree in Computer Science, Information Technology, or a related field.
  2. Proven experience as a Data Engineer or similar role with hands-on expertise in designing and managing data solutions in Vertica.
  3. Strong proficiency in SQL and experience with data modeling and schema design in Vertica.
  4. In-depth knowledge of ETL processes and tools, particularly for data integration into Vertica.
  5. Familiarity with other big data technologies (e.g., Hadoop, Spark) and cloud platforms (e.g., AWS, Azure) is advantageous.
  6. Understanding of data warehousing concepts and best practices.
  7. Experience in performance tuning and optimization of Vertica databases.
  8. Familiarity with Linux environments and shell scripting for data-related automation tasks is a plus.
  9. Excellent problem-solving skills and the ability to handle large datasets effectively.
  10. Strong communication and collaboration skills to work effectively within a team-oriented environment.
  11. Self-motivated, with the ability to work independently and manage multiple tasks and projects simultaneously.

Data Engineer

Dubai, Dubai Teliolabs Communication Private Limited

Posted today

Job Viewed

Tap Again To Close

Job Description

Location : Dubai

Who Can Apply: Candidates who are currently in Dubai
Job Type: Contract
Experience: 8+ years

Job Summary:

We are looking for an experienced Data Engineer to design, develop, and optimize data pipelines, ETL processes, and data integration solutions. The ideal candidate should have expertise in AWS cloud services, data engineering best practices, open-source tools, and data schema design. The role requires hands-on experience with large-scale data processing, real-time data streaming, and cloud-based data architectures.

Key Responsibilities:

  • Develop and Maintain Data Pipelines to process structured and unstructured data efficiently.
  • Implement ETL/ELT Workflows for batch and real-time data processing.
  • Optimize Data Processing Workflows using distributed computing frameworks.
  • Ensure Data Integrity and Quality through data validation, cleaning, and transformation techniques.
  • Work with AWS Cloud Services, including S3, Redshift, Glue, Lambda, DynamoDB, and Kinesis.
  • Leverage Open-Source Tools like Apache Spark, Airflow, Kafka, and Flink for data processing.
  • Manage and Optimize Database Performance for both SQL and NoSQL environments.
  • Collaborate with Data Scientists and Analysts to enable AI/ML model deployment and data accessibility.
  • Support Data Migration Initiatives from on-premise to cloud-based data platforms.
  • Ensure Compliance and Security Standards in handling sensitive and regulated data.
  • Develop Data Models and Schemas for efficient storage and retrieval.
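The real-time streaming responsibilities above (Kafka, Kinesis, Flink) center on windowed aggregation. As a plain-Python illustration of a tumbling-window count, engine-agnostic and not tied to this employer's stack:

```python
from collections import defaultdict

def tumbling_window_counts(events, window_seconds):
    """Group (timestamp, key) events into fixed, non-overlapping windows
    and count events per (window_start, key) -- the core operation behind
    tumbling-window aggregations in Flink or Kafka Streams."""
    counts = defaultdict(int)
    for ts, key in events:
        # Integer division snaps each timestamp to the start of its window.
        window_start = (ts // window_seconds) * window_seconds
        counts[(window_start, key)] += 1
    return dict(counts)
```

A streaming engine adds the hard parts on top of this, such as out-of-order events, watermarks, and state checkpointing, but the windowing logic is the same.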

Required Skills & Qualifications:

  • 8+ years of experience in data engineering, data architecture, and cloud computing.
  • Strong knowledge of AWS Services such as Glue, Redshift, Athena, Lambda, and S3.
  • Expertise in ETL Tools, including Talend, Apache NiFi, Informatica, dbt, and AWS Glue.
  • Proficiency in Open-Source Tools such as Apache Spark, Hadoop, Airflow, Kafka, and Flink.
  • Strong Programming Skills in Python, SQL, and Scala.
  • Experience in Data Schema Design, normalization, and performance optimization.
  • Knowledge of Real-time Data Streaming using Kafka, Kinesis, or Apache Flink.
  • Experience in Data Warehouse and Data Lake Solutions.
  • Hands-on experience with DevOps and CI/CD Pipelines for data engineering workflows.
  • Understanding of AI and Machine Learning Data Pipelines.
  • Strong analytical and problem-solving skills.

Preferred Qualifications:

  • AWS Certified Data Analytics – Specialty or AWS Solutions Architect certification.
  • Experience with Kubernetes, Docker, and serverless data processing.
  • Exposure to MLOps and data engineering practices for AI/ML solutions.
  • Experience with distributed computing and big data frameworks.

Data Engineer

Dubai, Dubai Everythinginclick

Posted today

Job Description

The Data Engineer will be responsible for developing semantic models on top of the Data Lake/Data Warehouse to fulfill the self-service BI foundation requirements. This includes data extraction from various data sources and integration into the central data lake/data warehouse using enterprise platforms like Informatica iPaaS.

Key Responsibilities of Data Engineer
  1. Designing data warehouse data models based on business requirements.
  2. Designing, developing, and testing both batch and real-time Extract, Transform and Load (ETL) processes required for data integration.
  3. Ingesting both structured and unstructured data into the SMBU data lake/data warehouse system.
  4. Designing and developing semantic models/self-service cubes.
  5. Performing BI administration and access management to ensure access and reports are properly governed.
  6. Performing unit testing and data validation to ensure business UAT is successful.
  7. Performing ad-hoc data analysis and presenting results in a clear manner.
  8. Assessing data quality of the source systems and proposing enhancements to achieve a satisfactory level of data accuracy.
  9. Optimizing ETL processes to ensure execution time meets requirements.
  10. Maintaining and architecting ETL pipelines to ensure data is loaded on time on a regular basis.
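The ETL optimization in item 9 usually starts with incremental, watermark-based extraction instead of full reloads. A minimal sketch of the pattern used in ADF/iPaaS-style loads (the tuple shape is an assumption for illustration):

```python
def incremental_extract(rows, watermark):
    """Select only rows modified after the last high-watermark and return
    the new watermark -- the standard pattern for incremental loads.

    `rows` is an iterable of (modified_at, payload) tuples.
    """
    new_rows = [r for r in rows if r[0] > watermark]
    # If nothing changed, carry the old watermark forward unchanged.
    new_watermark = max((r[0] for r in new_rows), default=watermark)
    return new_rows, new_watermark
```

The watermark would normally be persisted (e.g. in a control table) between pipeline runs so each run picks up exactly where the last one stopped.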
Qualification Required for Data Engineer
  1. 5 to 8 years of overall experience.
  2. Proven experience in the development of dimensional models in Azure Synapse with strong SQL knowledge.
  3. Minimum of 3 years working as a Data Engineer in the Azure ecosystem, specifically using Synapse, ADF & Databricks.
  4. Preferably 3 years of experience with data warehousing, ETL development, SQL queries, Synapse, ADF, PySpark, and Informatica iPaaS for data ingestion & data modeling.

Data Engineer - Intern

Dubai, Dubai Bayut | dubizzle

Posted today

Job Description

Bayut & dubizzle have the unique distinction of being iconic, homegrown brands with a strong presence across the seven emirates in the UAE. Connecting millions of users across the country, we are committed to delivering the best online search experience.

As part of Dubizzle Group, we are alongside some of the strongest classified brands in the market. With a collective strength of 8 brands, we have more than 160 million monthly users who trust in our dedication to providing them with the best platform for their needs.

The Data Engineer intern will be participating in exciting projects covering the end-to-end data lifecycle – from raw data integrations with primary and third-party systems, through advanced data modelling, to state-of-the-art data visualisation and development of innovative data products.

You will have the opportunity to learn how to build and work with both batch and real-time data processing pipelines. You will work in a modern cloud-based data warehousing environment alongside a team of diverse, intense and interesting co-workers. You will liaise with other departments – such as product & tech, the core business verticals, trust & safety, finance and others – to enable them to be successful.

Key Responsibilities Include:

  1. Raw data integrations with primary and third-party systems
  2. Data warehouse modelling for operational and application data layers
  3. Development in Amazon Redshift cluster
  4. SQL development as part of agile team workflow
  5. ETL design and implementation in Matillion ETL
  6. Design and implementation of data products enabling data-driven features or business solutions
  7. Data quality, system stability and security
  8. Coding standards in SQL, Python, ETL design
  9. Building data dashboards and advanced visualisations in Periscope Data with a focus on UX, simplicity and usability
  10. Working with other departments on data products – i.e. product & technology, marketing & growth, finance, core business, advertising and others
  11. Being part and contributing towards a strong team culture and ambition to be on the cutting edge of big data

Minimum Requirements:

  • Bachelor’s degree in computer science, engineering, math, physics or any related quantitative field.
  • Knowledge of relational and dimensional data models
  • Knowledge of terminal operations and Linux workflows
  • Ability to communicate insights and findings to a non-technical audience
  • Good SQL skills across a variety of relational data warehousing technologies especially in cloud data warehousing (e.g. Amazon Redshift, Google BigQuery, Snowflake, Vertica, etc.)
  • Attention to details and analytical thinking
  • Entrepreneurial spirit and ability to think creatively; highly driven and self-motivated; strong curiosity and a drive for continuous learning
  • Ability to contribute to a platform used by more than 5M users in the UAE and other platforms in the region.

Bayut & dubizzle is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees.


Vertica Data Engineer

Dubai, Dubai micro1

Posted today

Job Description

Overview

Job Title: Vertica Data Engineer

Job Type: Full-time

Location: On-site Dubai, Dubai, United Arab Emirates

As a Vertica Data Engineer, you will be responsible for architecting, developing, and maintaining our Vertica database systems and data pipelines. You will work closely with cross-functional teams, including data scientists, analysts, and developers, to ensure seamless data integration, transformation, and retrieval from Vertica. Your expertise in Vertica's architecture, ETL processes, data modeling, and performance tuning will be crucial in delivering robust and scalable data solutions to support our business objectives. You will also be responsible for migrating data from Oracle and SQL Server to the Vertica database in a timely manner.

Principal Responsibilities
  • Design and implement efficient data pipelines to extract, transform, and load (ETL) data from various sources into Vertica using popular ETL tools like SAP BODS and Informatica.
  • Develop and maintain data models and database structures optimized for high-performance data retrieval and analysis in Vertica.
  • Experience in migrating data from Oracle/SQL Server/SQLite to the Vertica database.
  • Write program units (functions/procedures) to prepare data based on business requirements.
  • Collaborate with data scientists and analysts to understand their data needs and provide scalable solutions to meet analytical requirements.
  • Implement data quality checks and data validation processes to ensure the accuracy and integrity of data stored in Vertica.
  • Experience in VerticaPy to work with machine learning libraries for data science use cases.
  • Work with the DevOps team to ensure smooth deployment, monitoring, and maintenance of Vertica instances.
  • Optimize database performance and query execution through indexing, partitioning, and other performance tuning techniques.
  • Monitor Vertica database health, troubleshoot performance issues, and proactively address potential bottlenecks.
  • Collaborate with data engineers and other stakeholders to define best practices and data governance standards for Vertica usage.
  • Stay up-to-date with the latest advancements in Vertica and big data technologies and recommend improvements to existing solutions.
  • Document technical specifications, data flows, and architectural designs related to Vertica implementations.
  • Provide training and support to end-users to enable them to navigate and interact with Power BI reports effectively.
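Migration work like the Oracle/SQL Server-to-Vertica moves described above is normally followed by a reconciliation pass between source and target extracts. A small, engine-agnostic sketch (the key function and row shapes are illustrative assumptions):

```python
def reconcile(source_rows, target_rows, key=lambda r: r[0]):
    """Compare source and target extracts after a migration: report the
    row-count delta and keys present on only one side."""
    src_keys = {key(r) for r in source_rows}
    tgt_keys = {key(r) for r in target_rows}
    return {
        "count_delta": len(target_rows) - len(source_rows),
        "missing_in_target": sorted(src_keys - tgt_keys),
        "unexpected_in_target": sorted(tgt_keys - src_keys),
    }
```

For large tables this key-set comparison is usually run per partition, often alongside column checksums, rather than over full extracts.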
Qualifications
  1. Bachelor's degree in Computer Science, Information Technology, or a related field.
  2. Proven experience as a Data Engineer or similar role with hands-on expertise in designing and managing data solutions in Vertica.
  3. Strong proficiency in SQL and experience with data modeling and schema design in Vertica.
  4. In-depth knowledge of ETL processes and tools, particularly for data integration into Vertica.
  5. Familiarity with other big data technologies (e.g., Hadoop, Spark) and cloud platforms (e.g., AWS, Azure) is advantageous.
  6. Understanding of data warehousing concepts and best practices.
  7. Experience in performance tuning and optimization of Vertica databases.
  8. Familiarity with Linux environments and shell scripting for data-related automation tasks is a plus.
  9. Excellent problem-solving skills and the ability to handle large datasets effectively.
  10. Strong communication and collaboration skills to work effectively within a team-oriented environment.
  11. Self-motivated, with the ability to work independently and manage multiple tasks and projects simultaneously.