
Senior Data Engineer (GCP, Databricks)

BURGEON IT SERVICES

Toronto · Hybrid · Contract · Senior · 1w ago

About the role

We are looking for a Senior Data Engineer to design and build scalable batch and real-time data pipelines using modern cloud and big data technologies on GCP and Databricks.

Key Skills (Must Have)

  • GCP (Google Cloud Platform)
  • Databricks
  • Python
  • Kubernetes
  • Apache Spark (Spark Streaming)
  • Kafka / Flink (real-time streaming)
  • Hadoop ecosystem (Hive, Pig, Spark)

Core Responsibilities

  • Build and maintain data pipelines (batch & real-time)
  • Develop streaming solutions using Spark/Kafka/Flink
  • Implement Data Lakehouse & Medallion architecture
  • Work with microservices-based data platforms
  • Set up and manage CI/CD pipelines & DevOps workflows
  • Collaborate with stakeholders and ensure data quality

Nice to Have

  • Unity Catalog
  • Terraform
  • Experience with AI coding tools (GitHub Copilot, Claude)

Skills

Apache Spark · Databricks · Flink · GCP · Hadoop · Hive · Kafka · Kubernetes · Pig · Python · Spark Streaming
