
Innovative Data Engineer for Advanced Data Infrastructure Development

Clever Digital Marketing

Remote · Canada · Full-time · Mid Level · Today

About the role

Take charge as a Data Engineer, crafting scalable ETL pipelines while optimizing production‑grade infrastructure in a fully remote environment. Leverage your expertise in GCP and Python to transform data into valuable insights for growth.

This role is integral to scaling a GCP‑native data platform, with a focus on building reliable data pipelines and enhancing data quality. You’ll translate architectural visions into effective data solutions while working in a collaborative, fast‑paced environment, building and managing the data systems that support major advertising initiatives across industries.

Key Responsibilities

  • Build robust, scalable ETL/ELT data pipelines
  • Integrate event streaming for real‑time data processes
  • Own transformation logic within the Medallion Architecture
  • Manage data models in BigQuery for optimal analytics
  • Implement data quality checks and monitoring systems
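As an illustration of the last two responsibilities above (owning transformation logic in a Medallion Architecture and implementing data quality checks), a minimal Python sketch of a bronze‑to‑silver step with a simple quality gate might look like the following. All field names and rules here are hypothetical examples, not part of the posting:

```python
# Hypothetical sketch: a bronze -> silver transformation in a Medallion-style
# pipeline, with a simple data quality gate. Field names and validation rules
# are illustrative only.

def to_silver(bronze_rows):
    """Clean raw ("bronze") ad-event rows into a validated "silver" set,
    routing rows that fail quality checks to a rejected list for monitoring."""
    silver, rejected = [], []
    for row in bronze_rows:
        # Quality check 1: spend must parse as a number.
        try:
            spend = float(row["spend"])
        except (KeyError, TypeError, ValueError):
            rejected.append(row)
            continue
        # Quality check 2: required id present, spend non-negative.
        if row.get("campaign_id") and spend >= 0:
            silver.append({"campaign_id": row["campaign_id"], "spend": spend})
        else:
            rejected.append(row)
    return silver, rejected

bronze = [
    {"campaign_id": "c1", "spend": "12.50"},
    {"campaign_id": "", "spend": "3.00"},  # fails: missing campaign_id
    {"campaign_id": "c2", "spend": -5},    # fails: negative spend
]
silver, rejected = to_silver(bronze)
```

In a production GCP pipeline the same pattern would typically run inside a managed transformation layer (e.g. a Cloud Run job writing to BigQuery), with rejected-row counts feeding the monitoring systems the role mentions.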

Requirements

  • 3–5+ years of data engineering experience
  • Strong GCP knowledge (BigQuery, Cloud Run, etc.)
  • Proficient in SQL and Python programming
  • Experience with major ad platform APIs
  • Understanding of data warehousing principles

Elevate your data engineering career by powering complex data systems that drive real decisions and outcomes for clients across various sectors.

Reference

#J-18808-Ljbffr

Skills

Google Cloud Platform (GCP) · BigQuery · Cloud Run · SQL · Python · Event streaming · Medallion Architecture · Data quality checks · Monitoring systems · Advertising platform APIs · Data warehousing
