Principal Engineer, Data Solutions
QuadReal Property Group
About QuadReal Property Group
QuadReal Property Group is a global real estate investment, development and operating company headquartered in Vancouver, British Columbia, with $98.5 billion in assets under management. From its foundation in Canada as a full-service real estate operating company, QuadReal has expanded globally to invest in equity and debt across both the public and private markets. QuadReal invests directly, through operating platforms in which it holds an ownership interest, and via programmatic partnerships.
QuadReal seeks to deliver strong investment returns while creating sustainable environments that bring value to the people and communities it serves. Now and for generations to come.
QuadReal: Excellence lives here.
Role Description:
The Principal Engineer, Data Solutions is a senior technical leadership position responsible for ensuring the quality, scalability, and effectiveness of data and analytics solutions within QuadReal's Enterprise Data & Analytics function. Reporting to the Director of Data Solutions, this role serves as the Director's technical right hand: setting standards, mentoring team members, evaluating complex solutions, and delivering hands-on technical work on the organization's most challenging data problems.
This is an individual contributor role for a seasoned data professional who wants to have enterprise-wide impact through technical excellence, mentorship, and thought leadership without taking on people management responsibilities.
Responsibilities:
Technical Leadership & Standards (40%)
- Set the Technical Bar: Define and enforce standards for data modeling, analytics engineering, and BI development across all Data Solutions squads
- Solution Evaluation: Review and validate technical designs for major initiatives, ensuring architectural soundness and alignment with enterprise patterns
- Vendor Management: Evaluate proposals from consulting partners and technology vendors; hold vendors accountable for quality and delivery
- Quality Assurance: Conduct code reviews, model reviews, and technical assessments to maintain high standards in alignment with governance best practices
Hands-On Technical Delivery (35%)
- Build Reference Implementations: Create exemplary data models, dbt projects, and BI dashboards that serve as templates for the organization
- Solve Complex Problems: Tackle technical challenges beyond current team capabilities, from advanced SQL optimization to complex dimensional modeling
- Develop Data Models: Design and implement scalable, well-documented data models using dbt, Snowflake/Fabric, and modern analytics engineering practices
- Hands-On Coding: Write production SQL, Python, and data transformation logic daily (60%+ hands-on work)
Enablement & Mentorship (25%)
- Upskill Data Analysts/Product Managers: Teach effective due diligence, solution design, and technical requirement translation
- Mentor Analytics Engineers: Provide structured guidance on data modeling patterns, dbt best practices, and analytical thinking
- Conduct Training: Lead technical workshops, lunch-and-learns, and documentation initiatives
- Foster Technical Culture: Build a culture of engineering excellence, continuous learning, and quality-first thinking
Experience and Qualifications:
- 6-8+ years in analytics engineering, data engineering, business intelligence, or related technical roles
- Proven track record designing and implementing enterprise-scale data solutions ($1B+ organization preferred)
- Deep expertise in data modeling methodologies (dimensional modeling, data vault, metrics layers, etc.)
- Production experience with modern data stack tools: dbt, Snowflake/Databricks/Fabric, SQL, Python
- Hands-on BI development using Power BI, Tableau, or similar enterprise platforms
- Vendor/partner management experience evaluating and holding consulting firms or technology vendors accountable
- Mentorship experience upleveling junior and mid-level data professionals
Core Technical Expertise:
- Expert-level SQL (complex queries, optimization, window functions, CTEs)
- Data modeling and dimensional design (star schema, snowflake schema, data vault)
- dbt (data build tool) or equivalent transformation frameworks
- Modern cloud data warehouses (Snowflake, Databricks, Fabric)
- Power BI or Tableau (DAX, calculated fields, performance optimization)
- Python for data analysis and automation
- Apache Airflow or similar orchestration tools
- Git/version control and CI/CD practices
- Data quality, testing, and observability frameworks
- Data governance and security best practices
Key Competencies:
- Technical Judgment: Can evaluate solutions and articulate trade-offs clearly; knows when to be pragmatic vs. principled
- Teaching Ability: Can explain complex technical concepts to non-technical audiences and mentor others effectively
- Hands-On Mindset: Prefers writing code to drawing diagrams; leads by example
- Intellectual Curiosity: Stays current with modern data stack trends and technologies