
Senior Data Engineer

Dubai, United Arab Emirates

Job description

About Company

The GAC Group is a privately-owned company, specialising in the delivery of high-quality shipping, logistics and marine services to customers worldwide. Emphasising a long-term approach, innovation, ethics and a strong human touch, GAC delivers a flexible and value-adding portfolio to help you achieve your strategic goals. You can read more about the company at


We are looking for a skilled Senior Data Engineer with experience building warehouses using the Kimball methodology and extensive Big Data experience with PySpark to join our Analytics Team. The ideal candidate will have experience in Data Warehousing, Big Data and ingestion, and should have cloud exposure, with Azure preferred. Good experience with open-source ingestion technologies is expected, and they should have used a data lake for analytics in previous projects.

The job is located in Dubai, United Arab Emirates. Day-to-day activities include ingesting data from various sources into the data lake and preparing it for analytics, building data warehouses, data modelling, and developing ETL code. The Senior Data Engineer will report to the BI and Analytics Managers and will work mostly with Azure and open-source technologies in the cloud to deliver analytics.

Job requirements

Basic Qualifications

  • 4-8 years’ experience in Data Warehousing and Big Data technologies and frameworks
  • At least 4 years building warehouses and 4 years working with Big Data technologies is preferred.
  • Bachelor’s degree or higher in a quantitative or technical field
  • Excellent written and spoken English skills - the ability to communicate is a must.
  • The ability to work in a dynamic, fast-paced work environment.
  • Self-motivated with the ability to work under minimal supervision.
  • Shipping, Logistics and Marine domain experience is preferred but not necessary.

Job Requirements

  • Hands-on experience with the Kimball methodology, data warehouse building, and data modelling.
  • Must have worked with ETL tools to load data.
  • Hands-on experience with Big Data technologies and frameworks: Hive, Spark, Hadoop, SQL on Big Data, Redshift, Azure SQL Data Warehouse/Synapse, etc.
  • Experience with the following tools and technologies:
    • Data Factory, Python, Git, Databricks
    • SQL Server
    • Power BI including data modelling and DAX development
    • Azure Platform
    • Linux OS Concepts
    • Data Modelling – Kimball Concepts, ETL Knowledge
    • Git, Azure DevOps Repos
    • Hadoop, Spark, Kafka
    • Relational SQL and NoSQL databases
    • Data pipeline/workflow management tools such as Azkaban and Airflow
    • Azure cloud – Data Factory, HDInsight, Databricks
    • Stream-processing systems such as Storm and Spark Streaming
    • Object-oriented/functional scripting languages such as Python and R
  • Work closely with other Data and Analytics team members to optimise the company’s data systems and pipeline architecture.
  • Design and build the infrastructure for data extraction, preparation, and loading of data from a variety of sources using technology such as SQL and Azure.
  • Build data and analytics tools that will offer deeper insight into the pipeline, allowing for critical discoveries surrounding key performance indicators and customer activity.
  • Continually seek greater efficiency across all of the company’s data systems.
  • Experience working with and extracting value from large, disconnected and unstructured datasets.
  • Demonstrated ability to build processes that support data transformation, data structures, metadata, dependency and workload management.
  • Strong interpersonal skills and ability to project manage and work with cross-functional teams.
  • Advanced working knowledge of SQL and experience with relational databases, including query authoring, as well as working familiarity with a variety of databases.
  • Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.