
Data Engineer

Marc Ellis

On-site · Mid Level · Today

About the role

Job Description

1. JOB DETAILS:

Job Title: Data Engineer
Department: Information Technology (IT)

Reports to: Senior Manager – MDM
Prepared By: Senior Manager – MDM

Grade of Reporting Manager: FC09
Prepared On: January 2026

Proposed Grade: FC07
Evaluated On:

Upgrade / New Role: New
HoD: Vice President – IT

2. Business unit Purpose:

The purpose of the Data Platform department is to provide a trusted, scalable, and secure foundation for data that enables the business to make faster, better-informed decisions and build data-driven products.

3. Job Purpose: One sentence describing the overall objective of the job and, essentially, why the job exists.

A Data Engineer with strong data modeling and pipeline development skills, along with hands-on experience supporting Power BI reporting and analytics. The ideal candidate will design, build, and optimize scalable data solutions that enable reliable dashboards, self-service analytics, and data-driven decision-making across the organization.

4. Job Dimensions and Scope: The significant elements on which the job has some direct or indirect impact. It provides data which gives an indication of the scope and scale of the job.

Direct Reports: 0

Indirect Reports: 0

Impact on Revenue: (Please Explain the Impact) 0

Impact on Cost: (Please Explain the Impact) 0

Impact on Employee (count): 1

5. Organisation Structure: Org chart for the Business Unit. The org chart should clearly depict the layout of the Business Unit. Please highlight the Manager and the Manager’s manager. (Please provide the complete span of control in case of managerial roles)

Chief Executive Officer

Vice President – IT

Senior Manager Master Data

Data Engineer

6. Key Result Areas: What are the critical accountabilities of the job? Write in bullet points and focus on the results that are expected from the job. In each statement (eight to ten in total), please indicate what has to be done, within which area or framework, and with what end result (e.g. “Prepare, gain agreement for, and implement, capital expenditure plans which ensure that future production/operation needs are met within acceptable cost limits”). Reference should be made to: contribution to business strategy; planning; budgeting; operations; team leadership; team contribution; challenges; latitude for decision making etc. Against each area of responsibility mention the measures for the end result i.e. mention the key indicators to track performance

  • Design, build, and maintain scalable data pipelines for ingesting data from multiple sources (databases, APIs, files, SaaS platforms)
  • Develop and manage data warehouses / lakehouses using modern architectures (Delta Lake, star schema, medallion layers)
  • Transform raw data into analytics-ready datasets optimized for Power BI consumption
  • Collaborate with analysts and business stakeholders to understand reporting requirements and translate them into reliable data models
  • Create and maintain semantic models and optimized datasets for Power BI
  • Ensure data quality through validation checks, monitoring, and reconciliation processes
  • Implement incremental loads, CDC patterns, and data refresh strategies
  • Optimize query performance for Power BI (model design, aggregations, partitions, DAX efficiency)
  • Support production deployments, troubleshoot data issues, and improve pipeline reliability
  • Document data models, pipelines, and best practices
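For context only (this sketch is not part of the posting), the "incremental loads" responsibility above usually means loading only rows newer than a stored watermark, then advancing it. A minimal hypothetical illustration in plain Python with sqlite3; all table and column names are invented for the example:

```python
import sqlite3

# Hypothetical incremental (watermark-based) load from a source table
# into an analytics-ready fact table. Names are illustrative only.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE source_orders (id INTEGER, amount REAL, updated_at TEXT);
    CREATE TABLE fact_orders   (id INTEGER PRIMARY KEY, amount REAL, updated_at TEXT);
    CREATE TABLE etl_watermark (table_name TEXT PRIMARY KEY, last_ts TEXT);
    INSERT INTO etl_watermark VALUES ('fact_orders', '2026-01-01');
    INSERT INTO source_orders VALUES
        (1, 10.0, '2025-12-31'),  -- before the watermark: already loaded
        (2, 20.0, '2026-01-02'),  -- new row
        (3, 30.0, '2026-01-03');  -- new row
""")

def incremental_load(conn):
    """Upsert only rows newer than the stored watermark, then advance it."""
    cur = conn.cursor()
    (last_ts,) = cur.execute(
        "SELECT last_ts FROM etl_watermark WHERE table_name = 'fact_orders'"
    ).fetchone()
    # Upsert so re-runs are idempotent (a simple CDC-style merge).
    cur.execute("""
        INSERT INTO fact_orders (id, amount, updated_at)
        SELECT id, amount, updated_at FROM source_orders
        WHERE updated_at > ?
        ON CONFLICT(id) DO UPDATE SET
            amount = excluded.amount, updated_at = excluded.updated_at
    """, (last_ts,))
    cur.execute("""
        UPDATE etl_watermark
        SET last_ts = (SELECT MAX(updated_at) FROM source_orders)
        WHERE table_name = 'fact_orders'
    """)
    conn.commit()
    return cur.execute("SELECT COUNT(*) FROM fact_orders").fetchone()[0]

loaded = incremental_load(conn)  # only rows 2 and 3 are loaded
```

Re-running the function is safe: rows at or below the watermark are skipped, and rows seen again are merged rather than duplicated.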

7. Job Context: A general commentary of any aspects of the job that are relevant and need more explanation e.g. current projects, operating context/environment, key challenges, economic climate etc.


8. Knowledge, Skills & Minimum Experience: What does it take to deliver the Key Result Areas in terms of knowledge/qualification, specific experience, technical and /or management skills, etc.? This should be based on the requirement for the job and should not be confused with the actual qualification & experience of the current incumbent.

Education Qualification: Bachelor’s degree or equivalent experience. At least 4 years of proven experience in a Data Engineer or similar role; at least 4 years of experience with Azure Data Factory / Synapse / Databricks / Microsoft Fabric and Power BI.

Work Experience:

  • Strong experience as a Data Engineer or in a similar role
  • Proficiency in SQL (advanced querying, performance tuning)
  • Experience with Power BI: dataset and semantic model design; star schema modeling for analytics; incremental refresh and refresh optimization; understanding of DAX performance (even if not writing complex DAX daily); advanced DAX and Power BI report optimization
  • Experience building ETL/ELT pipelines using tools such as Azure Data Factory / Synapse / Databricks / Microsoft Fabric, Apache Spark / PySpark, and Airflow or similar orchestration tools
  • Hands-on experience with data lakes, data warehouses, or lakehouse architectures
  • Knowledge of cloud platforms (Azure, AWS, or GCP)
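As background (not part of the posting), "star schema modeling" above refers to a central fact table joined to dimension tables for analytics. A minimal hypothetical sketch using sqlite3; all table names and data are invented:

```python
import sqlite3

# Hypothetical star schema: one fact table (sales) joined to two
# dimension tables (date, product). Names and data are illustrative.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, year INTEGER);
    CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, category TEXT);
    CREATE TABLE fact_sales  (date_key INTEGER, product_key INTEGER, amount REAL);
    INSERT INTO dim_date    VALUES (1, 2025), (2, 2026);
    INSERT INTO dim_product VALUES (10, 'Hardware'), (20, 'Software');
    INSERT INTO fact_sales  VALUES (1, 10, 100.0), (2, 10, 150.0), (2, 20, 50.0);
""")

# Typical analytics query: aggregate the fact table by dimension attributes.
rows = conn.execute("""
    SELECT d.year, p.category, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_date d    ON f.date_key = d.date_key
    JOIN dim_product p ON f.product_key = p.product_key
    GROUP BY d.year, p.category
    ORDER BY d.year, p.category
""").fetchall()
```

This shape (narrow fact table, descriptive dimensions) is what makes Power BI semantic models fast and easy to slice.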

Skills:

  • Data analysis
  • Database management
  • Problem-solving
  • Familiarity with data structures, system tables, and error messages, with experience in system problem resolution
  • Ability to communicate effectively, concisely, and logically in a timely manner and at an appropriate level, while maintaining confidentiality
  • Excellent organizational skills to function effectively under time constraints and within established deadlines, with particular attention to detail
  • Conflict resolution experience
  • Flexible and adaptable through organizational growth

9. COMPETENCY

Competency level

