428 Data Integration Specialist Roles jobs in the United Arab Emirates
ETL Developer
Posted today
Job Viewed
Job Description
We are currently looking for an ETL Developer (Banking) for our UAE operations, with the following skill set and terms & conditions.
Technical Skill Sets:
Data modelling, with good know-how of techniques such as star schema, snowflake schema, and dimensional modelling.
Expert in database stored procedures, SQL, PL/SQL, and Java.
ETL tools: Informatica, Microsoft SSIS.
Business intelligence tools: SAP BusinessObjects, Microsoft Power BI.
Core banking systems (preferred): Finacle, HPS Power Card.
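As a concrete illustration of the dimensional-modelling techniques listed above, here is a minimal star schema sketched with Python's built-in sqlite3 module. Every table and column name is invented for the example, not taken from any banking system named in this posting.

```python
import sqlite3

# Minimal star schema: one fact table joined to two dimension tables.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_customer (customer_key INTEGER PRIMARY KEY, name TEXT, segment TEXT);
CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, full_date TEXT, month TEXT);
CREATE TABLE fact_transactions (
    customer_key INTEGER REFERENCES dim_customer(customer_key),
    date_key INTEGER REFERENCES dim_date(date_key),
    amount REAL
);
INSERT INTO dim_customer VALUES (1, 'Acme', 'Corporate'), (2, 'Zia', 'Retail');
INSERT INTO dim_date VALUES (20240101, '2024-01-01', '2024-01');
INSERT INTO fact_transactions VALUES (1, 20240101, 500.0), (2, 20240101, 75.0);
""")

# Typical dimensional query: aggregate the fact table by a dimension attribute.
rows = cur.execute("""
    SELECT c.segment, SUM(f.amount)
    FROM fact_transactions f
    JOIN dim_customer c ON c.customer_key = f.customer_key
    GROUP BY c.segment ORDER BY c.segment
""").fetchall()
print(rows)  # [('Corporate', 500.0), ('Retail', 75.0)]
```

A snowflake schema would further normalise the dimensions (e.g. splitting `segment` into its own table); the fact table and the join pattern stay the same.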
Soft Skills
Strong analytical and problem-solving abilities.
Excellent communication skills in English both written and verbal.
Ability to work collaboratively in a team environment.
Preferred Qualifications
Understanding of the UAE Banking Regulatory Framework Submissions (BRF).
Familiarity with the CBUAE's SupTech initiative and its objectives.
Experience in automating regulatory reporting processes.
Key Responsibilities
Automation Development: Design and implement automation solutions for BRF submissions, focusing on data extraction, transformation, and loading (ETL) processes.
Database Management: Develop and optimize SQL/PL/SQL/Java/HTML scripts to interface with our core banking system, FinCore, and its various modules, ensuring accurate and timely data retrieval.
Collaboration: Work closely with the Business Intelligence Manager/Unit to translate functional requirements into technical specifications.
Reporting Tools Integration: Assist in integrating and configuring reporting tools (e.g., Business Objects or other platforms) to streamline report generation.
Compliance Alignment: Ensure all automation processes comply with the CBUAE's SupTech standards and data governance policies.
Documentation: Maintain comprehensive documentation of developed solutions, including technical specifications and user guides.
Support & Maintenance: Provide ongoing support for deployed solutions, addressing any issues and implementing enhancements as needed.
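The ETL-centred responsibilities above follow the classic extract-transform-load shape. The sketch below is purely illustrative: the field names and the in-memory "warehouse" are hypothetical stand-ins for a real core-banking source and regulatory-reporting target.

```python
def extract(source_rows):
    """Pull raw records from a source system (stubbed here as a list)."""
    return list(source_rows)

def transform(rows):
    """Normalise field names and types before loading."""
    return [
        {"account_id": r["acct"].strip(), "balance_aed": round(float(r["bal"]), 2)}
        for r in rows
    ]

def load(rows, target):
    """Append transformed records to the target store (a list stand-in)."""
    target.extend(rows)
    return len(rows)

# Hypothetical raw extract with messy whitespace and string-typed numbers.
source = [{"acct": " 1001 ", "bal": "2500.456"}, {"acct": "1002", "bal": "99"}]
warehouse = []
loaded = load(transform(extract(source)), warehouse)
print(loaded, warehouse[1])
```

In a production pipeline each stage would also log, checkpoint, and validate, but the staged structure is the same.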
Joining time frame: 2 weeks (maximum 1 month).
Additional Information:
Terms and conditions:
Joining time frame: maximum 4 weeks
Remote Work:
No
Employment Type:
Full-time
Data Engineer
Job Description
Location : Dubai
Who Can Apply: Candidates who are currently in Dubai
Job Type: Contract
Experience: 8+ years
Job Summary:
We are looking for an experienced Data Engineer to design, develop, and optimize data pipelines, ETL processes, and data integration solutions. The ideal candidate should have expertise in AWS cloud services, data engineering best practices, open-source tools, and data schema design. The role requires hands-on experience with large-scale data processing, real-time data streaming, and cloud-based data architectures.
Key Responsibilities:
- Develop and Maintain Data Pipelines to process structured and unstructured data efficiently.
- Implement ETL/ELT Workflows for batch and real-time data processing.
- Optimize Data Processing Workflows using distributed computing frameworks.
- Ensure Data Integrity and Quality through data validation, cleaning, and transformation techniques.
- Work with AWS Cloud Services, including S3, Redshift, Glue, Lambda, DynamoDB, and Kinesis.
- Leverage Open-Source Tools like Apache Spark, Airflow, Kafka, and Flink for data processing.
- Manage and Optimize Database Performance for both SQL and NoSQL environments.
- Collaborate with Data Scientists and Analysts to enable AI/ML model deployment and data accessibility.
- Support Data Migration Initiatives from on-premise to cloud-based data platforms.
- Ensure Compliance and Security Standards in handling sensitive and regulated data.
- Develop Data Models and Schemas for efficient storage and retrieval.
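One of the responsibilities above is ensuring data integrity through validation and cleaning. A minimal row-level validation pass might look like the following; the rule set and field names are hypothetical examples, not a prescribed standard.

```python
from datetime import datetime

# Hypothetical per-field validation rules: each returns True for a clean value.
RULES = {
    "user_id": lambda v: isinstance(v, int) and v > 0,
    "email": lambda v: isinstance(v, str) and "@" in v,
    "signup_date": lambda v: datetime.strptime(v, "%Y-%m-%d") is not None,
}

def validate(row):
    """Return the list of fields that fail their rule (empty list = clean row)."""
    failures = []
    for field, rule in RULES.items():
        try:
            if not rule(row.get(field)):
                failures.append(field)
        except (TypeError, ValueError):
            failures.append(field)
    return failures

clean, quarantined = [], []
for row in [
    {"user_id": 7, "email": "a@b.com", "signup_date": "2024-03-01"},
    {"user_id": -1, "email": "broken", "signup_date": "yesterday"},
]:
    (clean if not validate(row) else quarantined).append(row)
print(len(clean), len(quarantined))
```

Routing failing rows to a quarantine table rather than dropping them silently is what makes downstream reconciliation possible.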
Required Skills & Qualifications:
- 8+ years of experience in data engineering, data architecture, and cloud computing.
- Strong knowledge of AWS Services such as Glue, Redshift, Athena, Lambda, and S3.
- Expertise in ETL Tools, including Talend, Apache NiFi, Informatica, dbt, and AWS Glue.
- Proficiency in Open-Source Tools such as Apache Spark, Hadoop, Airflow, Kafka, and Flink.
- Strong Programming Skills in Python, SQL, and Scala.
- Experience in Data Schema Design, normalization, and performance optimization.
- Knowledge of Real-time Data Streaming using Kafka, Kinesis, or Apache Flink.
- Experience in Data Warehouse and Data Lake Solutions.
- Hands-on experience with DevOps and CI/CD Pipelines for data engineering workflows.
- Understanding of AI and Machine Learning Data Pipelines.
- Strong analytical and problem-solving skills.
Preferred Qualifications:
- AWS Certified Data Analytics – Specialty or AWS Solutions Architect certification.
- Experience with Kubernetes, Docker, and serverless data processing.
- Exposure to MLOps and data engineering practices for AI/ML solutions.
- Experience with distributed computing and big data frameworks.
Data Engineer
Job Description
The Data Engineer will be responsible for developing semantic models on top of the Data Lake/Data Warehouse to fulfill the self-service BI foundation requirements. This includes data extraction from various data sources and integration into the central data lake/data warehouse using enterprise platforms like Informatica iPaaS.
Key Responsibilities of Data Engineer:
- Designing data warehouse data models based on business requirements.
- Designing, developing, and testing both batch and real-time Extract, Transform and Load (ETL) processes required for data integration.
- Ingesting both structured and unstructured data into the SMBU data lake/data warehouse system.
- Designing and developing semantic models/self-service cubes.
- Performing BI administration and access management to ensure access and reports are properly governed.
- Performing unit testing and data validation to ensure business UAT is successful.
- Performing ad-hoc data analysis and presenting results in a clear manner.
- Assessing data quality of the source systems and proposing enhancements to achieve a satisfactory level of data accuracy.
- Optimizing ETL processes to ensure execution time meets requirements.
- Maintaining and architecting ETL pipelines to ensure data is loaded on time on a regular basis.
Required Experience:
- 5 to 8 years of overall experience.
- Proven experience in the development of dimensional models in Azure Synapse with strong SQL knowledge.
- Minimum of 3 years working as a Data Engineer in the Azure ecosystem, specifically using Synapse, ADF & Databricks.
- Preferably 3 years of experience with data warehousing, ETL development, SQL Queries, Synapse, ADF, PySpark, and Informatica iPaaS for data ingestion & data modeling.
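Loading data "on time on a regular basis", as this posting describes, is commonly implemented with an incremental high-watermark pattern. A minimal sketch using SQLite as a stand-in for Synapse or another warehouse, with invented table names:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE source_orders (id INTEGER, updated_at TEXT);
CREATE TABLE warehouse_orders (id INTEGER, updated_at TEXT);
INSERT INTO source_orders VALUES (1, '2024-01-01'), (2, '2024-01-05'), (3, '2024-01-09');
""")

def incremental_load(conn, watermark):
    """Copy only rows newer than the last successful load, then advance the watermark."""
    rows = conn.execute(
        "SELECT id, updated_at FROM source_orders WHERE updated_at > ?", (watermark,)
    ).fetchall()
    conn.executemany("INSERT INTO warehouse_orders VALUES (?, ?)", rows)
    # New watermark is the latest timestamp loaded; unchanged if nothing was new.
    return max((r[1] for r in rows), default=watermark)

watermark = incremental_load(conn, "2024-01-04")
count = conn.execute("SELECT COUNT(*) FROM warehouse_orders").fetchone()[0]
print(watermark, count)
```

Persisting the watermark between runs (in a control table, not in memory) is what keeps a scheduled pipeline idempotent.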
Data Engineer
Job Description
Dubai, United Arab Emirates | Posted on 07/29/2025
myZoi is changing lives for the better for those who deserve it the most. We are an exciting fintech start-up aiming to promote financial inclusion globally. Our vision is to provide a level playing field to the unbanked and the underbanked in accessing essential financial services in an affordable, convenient, and transparent fashion. We are looking for smart, ambitious, and purpose-driven individuals to join us in this journey. Please apply via the link below if you are interested.
You will be working in our Data Platform team, providing data capability for internal and product requirements for myZoi. You will be proactive and innovative and you will be using 100% cloud technologies based on AWS and modern Open Source tooling to provide a real-time data infrastructure, allowing our teams to gain unprecedented insight into our wealth of application data. You will work with a world-class team of Data Analysts and Engineers to provide best-in-class solutions.
Architect AWS-Centric Data Solutions:
- Design and optimize high-performance data pipelines leveraging AWS native tools.
- Architect a modular, AI-ready data lake with a roadmap to ensure secure ingestion, transformation, and consumption workflows.
- Implement scalable streaming solutions that factor in performance, scalability, and cost.
Embed Security & Compliance Across AWS Workloads:
- Build and enforce data governance protocols aligned with relevant regulatory and compliance requirements using AWS tools.
- Collaborate with cybersecurity teams to implement IAM best practices, encryption strategies, and secure networking.
- Maintain traceability and auditability for all data flows across the AWS stack.
Optimize for Observability & Cost Efficiency:
- Work with our Cloud Architect and SRE to deploy and fine-tune monitoring dashboards using Datadog and AWS CloudWatch for performance, anomaly detection, and security event correlation.
- Continuously evaluate storage and compute cost optimization across S3, EC2, Redshift, and Glue workloads.
Lead Through Influence and Collaboration:
- Partner with Data Science, Cloud Architect, Security and Engineering leads to align cloud architecture with evolving business goals and priorities to ensure future-readiness.
- Mentor junior engineers in AWS best practices, scalable design, and secure coding standards.
- Lead innovation across key Product initiatives.
Innovate with Purpose:
- Evaluate and integrate AWS-compatible orchestration tools like Airflow, Lake Formation, ECS, EKS, or Managed Workflows.
- Contribute to middleware and third-party orchestration strategies through secure APIs and event-driven patterns.
- Design data products based on requirements that focus on key use cases, such as those related to social impact.
- Bachelor's or Master's degree in Computer Science, Data Engineering, Software Engineering, or a related field.
- 8–10 years of professional experience in data engineering, including 5+ years architecting on AWS underpinned by data governance. Mastery of AWS cloud services (S3, Lambda, Glue, Redshift, Kinesis, Lake Formation, Crawler, etc.).
- Deep expertise in building scalable cloud-native solutions and managing secure data infrastructure ensuring data governance.
- Strong command of compliance-driven architecture design and real-time monitoring strategies.
- Good understanding of compliance frameworks related to data privacy and information security.
- Excellent communication skills, proven leadership in mentoring, and the ability to lead cross-functional initiatives.
- Proficiency with agile tools (Jira).
- Cloud Infrastructure & AWS Services: S3, Glue, Lambda, Redshift, Kinesis, IAM, CloudWatch, Lake Formation, etc. Strong awareness of AWS security tools.
- Data Orchestration: Experience with Apache Airflow on ECS or AWS Managed Workflows. Familiarity with Step Functions and event-driven orchestration patterns.
- Streaming & ETL Pipelines: Expertise in Kinesis Data Streams and Kafka (AWS-hosted or compatible). Proficiency in designing and optimizing ETL workflows using AWS.
- Monitoring & Observability: Awareness of or exposure to logs, alerting, monitoring, detection, and tuning.
- Security & Governance: Awareness of or exposure to AWS KMS, in addition to building governance workflows with AWS Config and Lake Formation.
- Data Modeling & Optimization: Extensive experience in design of AI-ready data lakes with scalable ingestion and query performance.
- Programming Languages: Advanced coding in Python and SQL. Experience in Java and ETL processes is also preferred.
- You have strong communication skills and curiosity, and you are a quick learner.
- You enjoy a creative, fast-paced, agile world.
- You enjoy mentoring and teaching other developers to create a world-class, cohesive team.
- You enjoy making friends and having fun.
At myZoi we strive to create both a product and a team that embrace equality, inclusion, diversity, and freedom. We want people who can be themselves and bring their own brand of value to the team. Come and join us!
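The streaming work in this posting (Kinesis, Kafka) ultimately reduces to patterns such as windowed aggregation. Below is a plain-Python sketch of a tumbling-window count, standing in for what a Kinesis or Flink consumer would compute; the event format is hypothetical.

```python
from collections import defaultdict

def tumbling_window_counts(events, window_seconds=60):
    """Group events into fixed (tumbling) windows keyed by window start time."""
    windows = defaultdict(int)
    for event in events:
        # Each event falls into exactly one non-overlapping window.
        window_start = (event["ts"] // window_seconds) * window_seconds
        windows[window_start] += 1
    return dict(windows)

# Hypothetical events with integer timestamps in seconds.
events = [{"ts": 5}, {"ts": 42}, {"ts": 61}, {"ts": 119}, {"ts": 130}]
print(tumbling_window_counts(events))  # {0: 2, 60: 2, 120: 1}
```

A real streaming job adds late-arrival handling and checkpointing, but the windowing arithmetic is the same.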
Data Engineer
Job Description
Do you want to love what you do at work? Do you want to make a difference, an impact, and transform people's lives? Do you want to work with a team that believes in disrupting the normal, boring, and average?
If yes, then this is the job for you. webook.com is Saudi’s #1 event ticketing and experience booking platform in terms of technology, features, agility, and revenue, serving some of the largest mega events in the Kingdom with over 2 billion in sales. webook.com is part of the Supertech Group, which also includes UXBERT Labs, one of the best digital and user experience design agencies in the GCC, along with Kafu Games, the largest esports tournament platform in MENA.
Key Responsibilities:
- Data Integration and ETL Development: Architect and implement robust data integration pipelines to extract, transform, and load data from various sources (e.g., databases, SaaS applications, APIs, and flat files) into a centralized data platform. Design and develop complex ETL processes to ensure data quality, consistency, and reliability. Optimize data transformation workflows for performance and scalability.
- Data Infrastructure and Platform Management: Implement and maintain data ingestion, processing, and storage solutions to support data and analytics needs. Ensure data infrastructure's reliability, security, and availability through monitoring, troubleshooting, and disaster recovery planning.
- Data Governance and Metadata Management: Collaborate with the data governance team to establish policies, standards, and procedures. Develop and maintain metadata management systems for data lineage, provenance, and traceability. Implement data quality control measures and validation processes to ensure data integrity.
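The metadata management for lineage and provenance described above can be pictured as a graph of dataset-to-upstream edges. Here is a toy registry with invented dataset names; a real system would persist this in a metadata store rather than in memory.

```python
class LineageRegistry:
    """Toy metadata store: records which upstream datasets each dataset derives from."""

    def __init__(self):
        self.edges = {}

    def record(self, dataset, upstreams):
        self.edges[dataset] = list(upstreams)

    def provenance(self, dataset):
        """Walk upstream edges to find every source feeding a dataset."""
        seen, stack = set(), [dataset]
        while stack:
            node = stack.pop()
            for parent in self.edges.get(node, []):
                if parent not in seen:
                    seen.add(parent)
                    stack.append(parent)
        return sorted(seen)

registry = LineageRegistry()
registry.record("sales_report", ["fact_sales"])
registry.record("fact_sales", ["raw_orders", "raw_customers"])
print(registry.provenance("sales_report"))
```

The same traversal run in the other direction gives impact analysis: which reports break if a raw source changes.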
Minimum Requirements:
- 5-6 years of experience as a Data Engineer or in a related data-driven role.
- Proficient in designing and implementing data pipelines using tools like Apache Airflow, Airbyte, or cloud-based services.
- Strong experience with data infrastructure such as data lakes, data warehouses, and real-time streaming platforms (e.g., Elastic, Google BigQuery, MongoDB).
- Expertise in data modeling, data quality, and metadata management.
- Proficient in programming languages like Python or Java, and SQL.
- Familiar with cloud platforms (AWS, Google Cloud) and DevOps practices.
- Excellent problem-solving skills and ability to work collaboratively across teams.
- Strong communication skills to translate technical concepts to stakeholders.
Preferred Qualifications:
- Experience with data visualization and BI tools (e.g., Tableau, Qlik).
- Knowledge of machine learning and AI applications in data initiatives.
- Project management experience and leadership in data projects.
Data Engineer
Job Description
About the Role
We are an emerging AI-native, product-driven, agile start-up under the Abu Dhabi government, and we are seeking a motivated and technically versatile Data Engineer to join our team. You will play a key role in delivering data platforms, pipelines, and ML enablement within a Databricks on Azure environment.
As part of a stream-aligned delivery team, you’ll work closely with Data Scientists, Architects, and Product Managers to build scalable, high-quality data solutions for clients. You'll be empowered by a collaborative environment that values continuous learning, Agile best practices, and technical excellence.
Ideal candidates have strong hands-on experience in Databricks, Python, and ADF, and are comfortable in fast-paced, client-facing consulting engagements.
Skills and Experience requirements
1. Technical
- Databricks (or similar) e.g. Notebooks (Python, SQL), Delta Lake, job scheduling, clusters, and workspace management, Unity Catalog, access control awareness
- Cloud data engineering – ideally Azure, including storage (e.g., ADLS, S3), compute, and secrets management
- Development languages such as Python, SQL, C#, JavaScript, etc., especially for data ingestion, cleaning, and transformation
- ETL / ELT – including structured logging, error handling, reprocessing strategies, APIs, flat files, databases, message queues, event streaming, event sourcing etc.
- Automated testing (ideally TDD), pairing/mobbing. Trunk Based Development, Continuous Deployment and Infrastructure-as-Code (Terraform)
- Git and CI/CD for notebooks, data pipelines, and deployments
2. Integration & Data Handling
- Experienced in delivering platforms for clients – including file transfer, APIs (REST, etc.), SQL/NoSQL/graph databases, JSON, CSV, XML, Parquet, etc.
- Data validation and profiling - assess incoming data quality. Cope with schema drift, deduplication, and reconciliation
- Testing and monitoring pipelines: Unit tests for transformations, data checks, and pipeline observability
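Schema drift, deduplication, and reconciliation, mentioned above, are usually handled by normalising each incoming record against an expected schema before loading. A sketch with a hypothetical three-field schema:

```python
# Hypothetical expected schema: field name -> target type.
EXPECTED = {"id": int, "name": str, "score": float}

def normalise(record):
    """Tolerate schema drift: missing fields become None, extra fields are
    dropped, and values are coerced to the expected type where possible."""
    out = {}
    for field, typ in EXPECTED.items():
        value = record.get(field)
        try:
            out[field] = typ(value) if value is not None else None
        except (TypeError, ValueError):
            out[field] = None
    return out

def deduplicate(records, key="id"):
    """Keep the last record seen for each key."""
    latest = {}
    for r in records:
        latest[r[key]] = r
    return list(latest.values())

batch = [
    {"id": "1", "name": "a", "score": "0.5", "extra": True},  # drifted types, extra column
    {"id": "1", "name": "a2", "score": 0.9},                  # duplicate id
    {"id": "2", "name": "b"},                                 # missing column
]
rows = deduplicate([normalise(r) for r in batch])
print(rows)
```

In Databricks the same idea appears as schema enforcement/evolution on Delta tables plus `MERGE` for dedup; this stdlib version only illustrates the logic.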
3. Working Style
- Comfortable leveraging the best of lean, agile and waterfall approaches. Can contribute to planning, estimation, and documentation, but also collaborative daily re-prioritisation
- Able to explain technical decisions to teammates or clients
- Documents decisions and keeps stakeholders informed
- Comfortable seeking support from other teams for Product, Databricks, Data architecture
- Happy to collaborate with Data Science team on complex subsystems
Nice-to-haves
- MLflow or light MLOps experience (for the data science touchpoints)
- dbt / Dagster / Airflow or similar transformation tools
- Understanding of security and compliance (esp. around client data)
- Past experience in consulting or client-facing roles
Candidate Requirements
- 5–8 years (minimum 3–4 years hands-on with cloud/data engineering, 1–2 years in Databricks/Azure, and team/project leadership exposure)
- Bachelor’s degree in Computer Science, Data Engineering, Software Engineering, Information Systems, or a related field
Job Type: Full-time
Benefits: Visa, insurance, yearly flight ticket, bonus scheme, relocation logistics covered
The interviewing process consists of 2 or 3 technical/behavioral interviews.
Data Engineer
Job Description
- Architect and optimize scalable data storage solutions, including data lakes, warehouses, and NoSQL databases, supporting large-scale analytics.
- Design and maintain efficient data pipelines using technologies such as Apache Spark, Kafka, Fabric Data Factory, and Airflow, based on cross-functional team requirements.
- Develop robust ETL processes for reliable data ingestion, utilizing tools like SSIS, ADF, and custom Python scripts to ensure data quality and streamline workflows.
- Optimize ETL performance through techniques like partitioning and parallel processing.
- Define and implement data models and schemas for structured and semi-structured sources, ensuring consistency and efficiency while collaborating with data teams to optimize performance.
- Establish and enforce data governance policies, ensuring data quality, security, and compliance with regulations, using tools like Microsoft SQL Server.
- Implement access controls, encryption, and auditing to protect sensitive data and collaborate with IT to address vulnerabilities.
- Manage and optimize cloud and on-prem infrastructure for data processing, monitor system performance, and implement disaster recovery enhancements.
- Leverage automation for provisioning, configuration, and deployment to improve operational efficiency.
- Provide technical leadership, mentoring team members in best practices and cloud technologies, while aligning data engineering initiatives with strategic goals.
Requirements:
- Bachelor's degree or higher in Software Engineering, Computer Science, Engineering, or a related field.
- 3-5 years of experience in data engineering, with a proven history of designing and implementing complex data infrastructure.
- Proficient in Python, Scala, or Java, with experience in scalable, distributed systems.
- Strong knowledge of cloud computing platforms and related services like AWS Glue, Azure Data Factory, or Google Dataflow.
- Expertise in data modeling, schema design, and SQL query optimization for both relational and NoSQL databases.
- Excellent communication and leadership skills, with the ability to collaborate effectively with cross-functional teams and stakeholders.
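The partitioning and parallel-processing optimisation named in the responsibilities above can be sketched with standard-library primitives. In practice a framework such as Spark handles the distribution, so the code below is only a conceptual stand-in (threads, not a cluster).

```python
from concurrent.futures import ThreadPoolExecutor

def partition(rows, n):
    """Split rows into n roughly equal partitions for parallel processing."""
    return [rows[i::n] for i in range(n)]

def transform_partition(rows):
    """Stand-in for a per-partition transform (here: double each value)."""
    return [r * 2 for r in rows]

rows = list(range(10))
parts = partition(rows, 3)
# ThreadPoolExecutor keeps the example dependency-free; real workloads would
# use processes or a distributed engine for CPU-bound transforms.
with ThreadPoolExecutor(max_workers=3) as pool:
    results = list(pool.map(transform_partition, parts))
merged = sorted(x for part in results for x in part)
print(merged)
```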
Data Engineer
Job Description
The ENTERTAINER is a leading digital company dedicated to adding value for consumers by bringing them the best incentive offers globally. We are a 100% digital, data-driven tech company providing first-rate offers across renowned dining, leisure, entertainment, and hotel brands worldwide. The ENTERTAINER has grown with the aim of creating unbeatable value and loyalty everywhere we operate. We believe that experience is everything, and that's why we are passionate about creating unforgettable experiences for our customers, partners, and employees.
About the Role
We are seeking a highly skilled and motivated Data Engineer to join our growing Data Engineering team. In this role you will be instrumental in building and maintaining our next-generation data platform, focusing on real-time and batch data processing and leveraging the power of Databricks and Azure Synapse Analytics. You will work with a variety of data sources, design robust data pipelines, and ensure data quality and accessibility for analytics and business intelligence initiatives. Strong SQL expertise is critical for this role, as you will be working extensively with structured and semi-structured data.
As a Data Engineer you will
- Design, develop, and maintain scalable and reliable data pipelines using Databricks, Azure Data Factory, and related Azure services to ingest, process, transform, and load data from diverse sources (e.g., databases, APIs, streaming platforms, cloud storage).
- Build and optimize data models and data warehouses in Azure Synapse Analytics, ensuring performance, scalability, and data integrity for analytical workloads.
- Develop and implement data quality checks, monitoring, and alerting systems to ensure data accuracy, completeness, and reliability across the data platform.
- Write complex and efficient SQL queries for data extraction, transformation, and analysis in Databricks (Spark SQL), Azure Synapse (T-SQL), and MySQL environments.
- Collaborate with data analysts, other engineers, Product, and other departments to understand data requirements, provide data solutions, and support data-driven initiatives.
- Optimize data pipelines and Synapse workloads for performance and cost-efficiency.
- Implement data security and governance best practices within the data platform.
- Document data pipelines, data models, and ETL processes clearly and comprehensively.
- Stay up to date with the latest technologies and best practices in data engineering, Databricks, Azure Synapse, and cloud data platforms.
- Troubleshoot and resolve data pipeline and data quality issues.
- Contribute to the continuous improvement of our data platform architecture and infrastructure.
- Bachelor's degree in Computer Science, Data Science, Engineering, or a related field.
- 4 years of experience as a Data Engineer or similar role.
- Expertise in SQL and demonstrated ability to write complex, optimized queries (SQL, T-SQL, Spark SQL, or similar).
- Strong hands-on experience with Databricks, including Spark programming (Python, Scala, or Java), Delta Lake, and Databricks Workflows/Jobs.
- Proven experience with Azure Synapse Analytics, including data warehousing, data modelling, and performance tuning.
- Experience with Azure Data Factory or similar ETL/data orchestration tools.
- Solid understanding of data warehousing concepts, data modelling principles (e.g., star schema, snowflake schema), and ETL/ELT methodologies.
- Experience working with cloud environments, particularly Microsoft Azure.
- Proficiency in at least one programming language such as Python or Scala.
- Experience with version control systems (e.g., Git).
- Strong problem-solving and analytical skills with a data-driven approach.
- Excellent communication, collaboration, and documentation skills.
- Competitive salary and benefits package.
- Opportunity to work with an energetic and innovative company.
- A chance to contribute to the success of a well-known UAE brand.
Please Note:
This position is a 6month contract role and requires candidates to have their own valid UAE visa. Candidates without their own visa will not be considered for the role.
Required Experience:
Senior IC
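The data quality checks this posting describes are often expressed as SQL queries that count violating rows. A minimal sketch using SQLite in place of Synapse, with invented table and check names:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE offers (offer_id INTEGER, brand TEXT, discount REAL);
INSERT INTO offers VALUES (1, 'CafeA', 0.25), (2, NULL, 0.50), (3, 'HotelB', 1.70);
""")

# Each check counts violating rows; a healthy table returns all zeros.
CHECKS = {
    "no_null_brand": "SELECT COUNT(*) FROM offers WHERE brand IS NULL",
    "discount_in_range": "SELECT COUNT(*) FROM offers WHERE discount NOT BETWEEN 0 AND 1",
}

def run_checks(conn):
    return {name: conn.execute(sql).fetchone()[0] for name, sql in CHECKS.items()}

failures = run_checks(conn)
print(failures)  # {'no_null_brand': 1, 'discount_in_range': 1}
```

Wired into a scheduler, non-zero counts would raise alerts rather than just print, which is the monitoring-and-alerting half of the responsibility.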
Data Engineer
Job Description
Join to apply for the Data Engineer (m/f/d) role at Halian
We are seeking a highly skilled Data Engineer to join our growing data team. This is an exciting opportunity to design, build, and maintain robust data pipelines and data visualization solutions that will drive critical business insights.
Responsibilities:
- Design, develop, and maintain scalable ETL pipelines using SQL Server Integration Services (SSIS) and/or Informatica.
- Create and optimize data models for efficient data storage and retrieval.
- Develop interactive and insightful data visualizations and reports using Power BI Desktop and Power BI Service.
- Implement and manage Power BI gateways for secure and reliable data access.
- Write and optimize complex SQL queries, stored procedures, and perform database performance tuning.
- Collaborate with stakeholders to understand data requirements and deliver effective data solutions.
Requirements:
- 8-10 years of experience in the BI & Analytics domain with a focus on data engineering.
- Minimum 4 years of hands-on experience building ETL pipelines using SSIS and/or Informatica.
- Minimum 3 years of experience developing and deploying data visualizations using Power BI.
- Strong proficiency in database concepts, data modeling, and advanced SQL development.
- Proven track record of developing and deploying data integration and reporting solutions to production.
- Excellent analytical, problem-solving, and communication skills.
With over 28 years of experience, we understand that innovation is essential to providing agile, practical solutions that transform businesses and careers.
Our resourcing and smart services help you realize tomorrow's potential. Discover the amazing possibilities when you bring the right people and technologies together.
At Halian, we value diversity, equity, and inclusion (DEI). We are committed to connecting organizations with top talent from all backgrounds, ensuring everyone feels valued, respected, and empowered. We encourage applications from all qualified candidates, regardless of race, gender, disability, or other characteristics. By fostering diverse and inclusive workplaces, we drive innovation and better reflect the communities we serve.
Position Details
- Location: Dubai, United Arab Emirates
- Seniority level: Mid-Senior level
- Employment type: Full-time
- Job function: Information Technology
- Industry: Staffing and Recruiting