Data Engineer
AutoCanada
About the Role
AutoCanada Inc. is seeking a talented and driven Data Engineer to join our Information Management team. This role plays a key part in designing, building, and maintaining scalable, high‑performance data solutions that enable analytics, reporting, and business intelligence across the organization. The ideal candidate brings hands‑on experience with modern Azure data platforms and a strong understanding of data engineering best practices.
Key Responsibilities
- Design, develop, and maintain scalable data pipelines and ETL/ELT processes using Azure‑based technologies
- Build and optimize data solutions leveraging Azure Databricks, Azure Synapse Analytics, and Microsoft Fabric
- Develop, maintain, and optimize complex T‑SQL queries, stored procedures, and database objects
- Develop and manage ETL and orchestration workflows using Synapse Pipelines, Fabric Pipelines, and dbt
- Design and implement scalable data models (e.g., dimensional/star schemas) to support analytics and reporting
- Implement and manage CI/CD pipelines to support efficient and reliable data engineering workflows
- Monitor, troubleshoot, and optimize data pipelines and system performance
- Ensure data quality, integrity, governance, and security across all platforms
- Document technical designs, data flows, and operational processes
- Collaborate with data analysts, engineers, and business stakeholders to deliver high‑quality, reliable data solutions
Required Qualifications
- 2–5 years of experience in data engineering, with hands‑on expertise in:
  - Azure Databricks
  - Azure Synapse Analytics
  - Microsoft Fabric
  - T‑SQL (complex query development, stored procedures, and performance tuning)
- Proven experience building and maintaining scalable data pipelines
- Strong understanding of data warehousing concepts and ETL/ELT methodologies
- Experience working with cloud‑based data architecture, including Lakehouse models
- Experience with Git and CI/CD practices (Azure DevOps)
- Experience with dbt (data build tool) is a plus
- Microsoft Fabric Data Engineer Associate certification is a plus
- Industry experience in automotive retail environments is an asset
Core Competencies
- Strong communication and stakeholder collaboration skills
- Ability to work both independently and within cross‑functional teams
- Excellent organizational and time management abilities
- Ability to adapt to evolving business needs in a supportive and collaborative environment
Why Join AutoCanada Inc.?
- Work with modern, cloud‑based data technologies at scale
- Be part of a collaborative, forward‑thinking team
- Opportunities for professional growth and career development
- Competitive compensation and comprehensive benefits
The Perks
- Competitive Compensation and Benefits Package
- Employee Vehicle Purchase & Service Plans
- Employee and Family Assistance Programs
- Company‑wide appreciation events and contests throughout the calendar year
- Professional development and the opportunity to grow your career
How to Apply
Interested candidates are encouraged to submit their resume and a cover letter highlighting relevant experience.
AutoCanada Inc. is an equal opportunity employer committed to fostering a diverse and inclusive workplace.