
Data Engineer

Motive

Vancouver · Hybrid · Full-time · Mid Level

About the role

About Motive

Motive empowers the people who run physical operations with tools to make their work safer, more productive, and more profitable. For the first time ever, safety, operations, and finance teams can manage their drivers, vehicles, equipment, and fleet-related spend in a single system. Combined with industry-leading AI, the Motive platform gives you complete visibility and control, and significantly reduces manual workloads by automating and simplifying tasks.

Motive serves nearly 100,000 customers – from Fortune 500 enterprises to small businesses – across a wide range of industries, including transportation and logistics, construction, energy, field service, manufacturing, agriculture, food and beverage, retail, and the public sector.

Role Overview

As a Data Engineer on our BI team, you’ll be a key player in Motive’s growth, delivering the data infrastructure for the AI era. You’ll act as the essential link between complex data and key business domains, delivering the high-quality datasets and semantic models that drive global strategy. This is an exciting opportunity to implement cutting-edge tooling, leverage AI to enhance your workflow, and master a modern data stack in a fast-evolving environment.

This is the perfect role for a "jack of all trades" data practitioner. You’ll design data models, build robust pipelines, manage DevOps and automated systems, and implement AI-driven data tooling. You’ll even get your hands dirty building dashboards and performing deep-dive analysis. If you love working full-stack and owning the entire data lifecycle, you’ll love this role.

What You'll Do

  • Collaborate & Strategize: Partner closely with business stakeholders to understand their challenges and design end-to-end architecture that solves complex business problems.
  • Build & Maintain Data Models: Design, develop, and own robust, efficient, and scalable data models in Snowflake and Iceberg using dbt and advanced SQL.
  • Orchestrate & Automate: Build and manage reliable data pipelines and CI/CD workflows using tools like Airflow, Python, and Terraform to ensure data is fresh, trustworthy, and infrastructure is version‑controlled.
  • Champion Data Quality: Implement rigorous testing, documentation, and data governance practices to maintain a single source of truth.
  • Enable Analytics & Workflows: Act as the Product Owner and Tech Lead for your data domains, taking responsibility for end-to-end data product delivery, from raw ingestion to data models that enable analytics and data apps in tools like Tableau and Retool.
  • Innovate with AI: Help us build our next‑generation data infrastructure by integrating AI capabilities (like Snowflake Cortex AI) to democratize analytics and empower the business.
  • Architect Observability: Implement monitoring and alerting frameworks (e.g., dbt packages or Monte Carlo monitors) to proactively catch "silent" data failures before stakeholders do.
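To make the observability bullet concrete, here is a minimal sketch of the kind of freshness monitor it describes. This is an illustration only, not Motive's actual tooling: the table names and SLA thresholds are hypothetical, and a real implementation would read load timestamps from the warehouse (or use dbt source freshness / Monte Carlo monitors) rather than an in-memory dict.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical freshness SLAs: table -> maximum allowed staleness.
# A breach here is exactly the kind of "silent" failure a monitor
# should surface before stakeholders notice stale dashboards.
FRESHNESS_SLA = {
    "fct_trips": timedelta(hours=6),
    "dim_vehicle": timedelta(hours=24),
}

def stale_tables(last_loaded: dict, now: datetime) -> list:
    """Return tables that are missing or have breached their freshness SLA."""
    breaches = []
    for table, sla in FRESHNESS_SLA.items():
        loaded_at = last_loaded.get(table)
        if loaded_at is None or now - loaded_at > sla:
            breaches.append(table)
    return sorted(breaches)

now = datetime(2024, 1, 2, 12, 0, tzinfo=timezone.utc)
loads = {
    "fct_trips": now - timedelta(hours=8),    # 8h old, breaches the 6h SLA
    "dim_vehicle": now - timedelta(hours=2),  # 2h old, within the 24h SLA
}
print(stale_tables(loads, now))  # ['fct_trips']
```

In practice a check like this would run on a schedule (e.g., as an Airflow task) and route breaches to alerting rather than printing them.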

What We're Looking For

  • 6+ years of experience in Analytics Engineering, Data Engineering, or a similar role.
  • Deep expertise in SQL and developing complex data models for analytical purposes (e.g., dimensional modeling).
  • Hands‑on experience with:
    • Data Warehousing: High proficiency in Snowflake (preferred) and experience with Open Table Formats like Iceberg.
    • Data Transformation: dbt
    • Orchestration & ETL: Airflow, Fivetran, Airbyte
    • Cloud Platform: AWS
    • Programming/Ingestion: Python
    • Infrastructure as Code: Terraform
    • AI‑Augmented Development: Proficiency using AI coding assistants (Cursor, Copilot, or Claude) to accelerate development and automate routine tasks.
  • A strong analytical mindset with a proven ability to solve ambiguous business problems with data.
  • Excellent communication skills and experience working cross‑functionally.
  • Self‑starter with the ability to self‑project manage work.
  • A user focus with the ability to understand how a data consumer will use the data products you build.
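For candidates wondering what "dimensional modeling for analytical purposes" means in miniature: the sketch below builds a toy star schema (one fact table, one dimension) and runs an analytical rollup over it. It uses Python's built-in sqlite3 purely as a stand-in for Snowflake, and every table and column name is illustrative, not taken from Motive's models.

```python
import sqlite3

# Toy star schema: fct_trips (facts) joins to dim_vehicle (dimension)
# on a surrogate key, the basic shape of a dimensional model.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_vehicle (vehicle_id INTEGER PRIMARY KEY, fleet TEXT);
    CREATE TABLE fct_trips (
        trip_id INTEGER PRIMARY KEY,
        vehicle_id INTEGER REFERENCES dim_vehicle(vehicle_id),
        miles REAL
    );
    INSERT INTO dim_vehicle VALUES (1, 'west'), (2, 'east');
    INSERT INTO fct_trips VALUES (10, 1, 120.0), (11, 1, 80.0), (12, 2, 50.0);
""")

# Analytical query: total miles per fleet, the classic fact-to-dimension
# join-and-aggregate that a BI tool would issue against such a model.
rows = conn.execute("""
    SELECT d.fleet, SUM(f.miles) AS total_miles
    FROM fct_trips f
    JOIN dim_vehicle d USING (vehicle_id)
    GROUP BY d.fleet
    ORDER BY d.fleet
""").fetchall()
print(rows)  # [('east', 50.0), ('west', 200.0)]
```

In a dbt project the same logic would live in a SQL model with schema tests (e.g., unique and not_null on the keys) rather than an ad hoc script.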

Bonus Points (Nice-to-Haves)

  • Experience building semantic models for natural language querying.


Skills

AWS, Airbyte, Airflow, Claude, Copilot, Cursor, dbt, Fivetran, Iceberg, Python, SQL, Snowflake, Terraform
