Data Engineer - Strong Intermediate / Senior level

Canada · Hybrid · Contract · Senior

About the role

This role is in the public sector and requires candidates to be ready for a deep-dive background check similar to a security clearance. Candidates must have 5 years of verifiable residency in Canada, permanent resident or Canadian citizen status, and a clear criminal record. A Canadian security clearance of any level is a nice-to-have.

Technical Stack

  • Microsoft Fabric
  • Azure Data Factory, Azure Synapse (Notebooks, Pipelines)
  • Azure Cloud Platform
  • Python, SQL
  • SSIS
  • Power BI
  • CI/CD pipelines
  • Infrastructure as Code (IaC) with Terraform or equivalent

Requirements

  • 7-10 years of experience designing and maintaining cloud‑based data warehouses
  • 7+ years building scalable ELT pipelines using Microsoft Fabric or Azure data services
  • Expert‑level proficiency in Python and SQL
  • Hands‑on experience designing and developing SSIS packages
  • Strong background in data modeling (dimensional, star, snowflake schemas)
  • Experience supporting BI and analytics platforms (e.g., Power BI)
  • Proven experience implementing CI/CD for data pipelines and infrastructure‑as‑code
  • Microsoft Azure Data Engineer Associate (DP‑203) or equivalent (highly preferred)

Responsibilities

  • Design, develop, and maintain reliable ELT pipelines on Azure and Microsoft Fabric
  • Lead or contribute to end‑to‑end delivery of enterprise data solutions, from assessment through deployment
  • Develop and optimize complex Python and SQL transformations
  • Maintain orchestration workflows supporting batch and near‑real‑time processing
  • Implement CI/CD pipelines to support automated deployment and version control
  • Produce and maintain technical artifacts including architecture diagrams, solution designs, and ETL documentation
  • Ensure data solutions align with data management standards, security, privacy, and governance requirements
  • Support testing, validation, benchmarking, and data quality monitoring
  • Create documentation covering data models, lineage, mappings, business rules, and transformations
  • Contribute to knowledge sharing and data literacy initiatives across teams

Additional Information

  • Duration: 6 months (possible extension)
  • Hours: 35 hours/week

Skills

Azure Cloud Platform, Azure Data Factory, Azure Synapse, CI/CD pipelines, Infrastructure as Code (IaC), Microsoft Fabric, Power BI, Python, SQL, SSIS, Terraform
