Principal Data Engineer

Questrade Financial Group

Canada · Hybrid · Full-time · Lead · CA$140k – CA$160k/yr

About the role

Questrade Financial Group (QFG), through its companies (Questrade, Questbank, Questrade Wealth Management, Community Trust Company, Zolo, and Flexiti), provides securities and foreign currency investment, professionally managed investment portfolios, mortgages, real estate services, financial services, and more. We use cutting-edge technology to help Canadians become much more financially successful and secure.

At QFG, we combine human-centric collaboration with AI-driven innovation to redefine financial services. The ideal candidate will be a catalyst for change, using AI to transform customer experiences and to shape a future where AI empowers our teams to do their best work.

Join our diverse, inclusive, and hybrid workplace to unleash your creativity and nurture your curiosity without limits. If you share this sense of infinite possibility, come shape your future at QFG.

What’s in it for you as an employee of QFG?

  • Health & wellbeing resources and programs
  • Paid vacation, personal, and sick days for work-life balance
  • Competitive compensation and benefits packages
  • Work-life balance in a hybrid environment with at least 3 days in office
  • Career growth and development opportunities
  • Opportunities to contribute to community causes
  • Work with diverse team members in an inclusive and collaborative environment

This job posting is for an existing vacancy.

We’re looking for our next Principal Data Engineer. Could it be you?

The ideal Principal Data Engineer will be an experienced professional ready to work in an agile environment. This role requires in-depth knowledge and understanding of data ingestion, orchestration, compute, automation, and modeling, particularly within the high-velocity domain of brokerage technology and Digital Investing Engineering.

Need more details? Keep reading…

Responsibilities

  • Design, develop, and maintain robust, scalable, and high-performance brokerage data pipelines and ETL/ELT processes, ensuring data quality, integrity, and timely availability for consumption.
  • Spearhead the creation of innovative data products and cross-domain data assets that align with QuestEnterprise's top-line OKRs, specifically supporting the high-velocity demands of Digital Investing Engineering.
  • Act as a technical leader and subject matter expert on data architecture, modeling, and best practices, driving the modernization of data infrastructure to leverage cutting-edge cloud technologies (GCP/Databricks).
  • Drive end-to-end automation of data workflows, monitoring, alerting, and deployment processes to enhance operational efficiency and reliability.
  • Enable and operationalize AI tooling and machine learning pipelines in close collaboration with Data Science and ML Ops teams, translating complex models into production-ready data flows.
  • Provide expert data consultation and enablement for self-servicing capabilities, empowering business analysts and stakeholders with tools (e.g., PowerBI, Looker) to access and derive insights independently.
  • Serve as the primary liaison between business stakeholders, software engineering, data science/ML Ops, and Enterprise data/AI enablement teams, translating business needs into technical data solutions.
  • Support audits and operational due diligence by ensuring comprehensive data lineage, governance, security, and compliance across all data products and infrastructure.
  • Mentor and coach junior and intermediate data engineers, fostering a culture of engineering excellence, continuous learning, and technical innovation within the team.

Qualifications

So are YOU our next Principal Data Engineer? You are if you…

  • Have 8+ years of progressive experience in the data engineering field.
  • Possess expert-level proficiency with GCP data engineering services, including BigQuery, Dataflow, Airflow (or Cloud Composer), Pub/Sub, Data Catalog, and CloudSQL, or equivalent expertise with Databricks.
  • Have demonstrated experience with relational data stores such as MSSQL or MySQL.
  • Have strong knowledge of SQL and Python.
  • Have experience in data modeling for both on-premises and cloud consumption, including expertise in technical architecture, infrastructure, and robust ETL/ELT pipeline development, with a focus on data ingestion, orchestration, and compute optimization.

Technical Leadership and AI/BI Specialization

  • Spearhead the implementation of self-service Business Intelligence (BI) solutions, leveraging tools such as PowerBI and Looker, or advanced technologies like conversational AI agents (e.g., Google Cloud BigQuery and Databricks AI agents).
  • Bring practical experience with generative AI developer tools (e.g., Claude, Cursor, GitHub Copilot) to significantly boost coding efficiency, accelerate development, and enhance data pipeline quality.
  • Act as a technical leader to enable the adoption of new technologies both within the immediate team and across the broader organization.
  • Work in close collaboration with Solution Architects and Data Science teams to design and refine data ingestion pipelines and define comprehensive data modeling strategies for consumption.
  • Collaborate with the team to strategically decide on the most appropriate tools and methodologies for various data integration scenarios.

Project Management and Mentorship

  • Verifiable track record of successfully leading multiple concurrent projects, including proactively troubleshooting technical challenges and efficiently resolving production issues in a timely manner with the team.
  • Provide guidance and mentorship to new and current team members to facilitate their upskilling and professional growth.
  • Proven ability to thrive in ambiguity, effectively prioritize competing needs, and consistently deliver results in a dynamic, fast-paced environment.

Communication and Stakeholder Influence

  • Exceptional presentation and communication skills (e.g., PowerPoint, Google Slides).
  • Ability to communicate effectively and influence a diverse group of stakeholders, including external engineering teams, product development teams, business stakeholders, and external partners.
  • Capability to participate in and present novel technologies or concepts during enterprise-wide forums (e.g., QuestTalk).

Good-to-have skills:

  • Databricks
  • Experience working in a SAFe Agile development process
  • Designing, documenting, and developing complex data pipelines and cross-domain data products
  • Knowledge of the financial industry (investment & trading/brokerage technology)
  • Google Cloud Professional Data Engineer certification (preferred)

Compensation Information:

  • Base salary range: CA$140,000 – CA$160,000
  • The final compensation package will be commensurate with the successful candidate's experience, skills, and geographic location (Canada). It includes a comprehensive benefits plan and a competitive incentive (bonus) program for Full-Time Permanent roles.

Sounds like you? Click below to apply!

At Questrade Financial Group of Companies, with multiple office locations around the world, we are committed to fostering a diverse, inclusive and accessible work environment. This is an environment where individuals are treated with dignity and respect. Here, the unique skills and experience you bring will be valued. You will be supported and motivated, so that you can harness your unlimited potential. Our team reflects the diversity of the communities we serve and operate in. Having a collaborative and diverse team helps us push boundaries to bring the future of fintech into existence—not only for the benefit of our customers, but for those who build their career with us.

The Questrade Financial Group of Companies' Applicant Tracking System utilizes artificial intelligence (AI) for application screening. The AI system operates on predetermined criteria, with final decisions subject to human review.

Candidates selected for an interview will be contacted directly. If you require accommodation during the recruitment/selection process, please let us know and we will work with you to meet your needs.

Skills

Airflow, BigQuery, Cloud Composer, CloudSQL, Cursor, Databricks, Data Catalog, Dataflow, GitHub Copilot, GCP, Looker, ML Ops, MSSQL, MySQL, PowerBI, Pub/Sub, Python, SQL
