Data Engineer
Cartier International AG
About the role
The story of Cartier is founded on curiosity and passion. For more than 170 years we have embraced a bold, pioneering spirit that continues to inspire our teams across all Métiers from our boutiques to our workshops and corporate offices. Our 10,000+ colleagues of more than 105 nationalities are united by a shared independent spirit and commitment to excellence, striving to continuously enrich our Maison’s heritage by pushing the boundaries of creativity and innovation.
“Join us to unleash the power of #data, leverage our cloud platforms and deploy state-of-the-art initiatives at scale! You will be part of a young, dynamic and talented team working on analytics and artificial intelligence with real-life, impactful business applications.” — Thomas M., Data Officer
As a Data Engineer, you’ll be responsible for building and maintaining the data and BI infrastructure that powers our analytics and ML initiatives. You’ll design scalable data pipelines, ensure data quality, enable self-service analytics capabilities and build BI products across the organization.
HOW WILL YOU MAKE AN IMPACT?
"The data engineer sits at the foundation of the data stack, ensuring that every insight in Looker is powered by a robust, scalable, and governed pipeline."
Navigating between technical infrastructure and business intelligence, you will have the opportunity to design the data models that drive business decisions across the Maison.
YOUR MAIN MISSIONS
Within the Cartier Data Office, you will build business-ready data models, high-performance infrastructure, and large-scale reporting dashboards.
Analytics Capabilities with Looker
- Configure and maintain Looker dashboards and explores, translating complex business questions into actionable insights.
- Train "viewers" and "explorers" on how to get the most out of the platform.
Data Modeling & Transformation
- Design and implement dbt models and dimensional data marts, applying software engineering best practices (version control, testing) to the analytics layer.
Pipeline Engineering
- Build and maintain ETL/ELT pipelines using Airflow/Cloud Composer, ensuring seamless data ingestion into BigQuery.
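As a rough illustration of the kind of ELT step this mission covers — all table names, field names, and functions below are hypothetical, not Cartier's actual pipelines — a task might validate raw records, normalize them, and hand them to a warehouse load job. In production this logic would typically live in an Airflow/Cloud Composer DAG writing to BigQuery:

```python
from datetime import datetime, timezone

def extract_orders(raw_rows):
    """Simulate ingesting raw order events (in production: an API pull or GCS export)."""
    # Drop records missing the business key so downstream joins stay clean.
    return [r for r in raw_rows if r.get("order_id") is not None]

def transform_orders(rows):
    """Normalize amounts and currency codes, and stamp the load time (hypothetical schema)."""
    loaded_at = datetime.now(timezone.utc).isoformat()
    return [
        {
            "order_id": r["order_id"],
            "amount": round(float(r["amount"]), 2),
            "currency": r.get("currency", "EUR").upper(),
            "loaded_at": loaded_at,
        }
        for r in rows
    ]

def load_orders(rows, destination="analytics.fct_orders"):
    """Stand-in for a BigQuery load job (in production: the google-cloud-bigquery client)."""
    print(f"Loading {len(rows)} rows into {destination}")
    return len(rows)

raw = [
    {"order_id": 1, "amount": "1250.5", "currency": "chf"},
    {"order_id": None, "amount": "99.0"},   # dropped: missing business key
    {"order_id": 2, "amount": "340.999"},   # currency defaults to EUR
]
clean = transform_orders(extract_orders(raw))
load_orders(clean)
```

In an actual DAG, each function would become its own task so that ingestion, transformation, and loading can be retried and monitored independently.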
Platform Operations
- Monitor data platform health, troubleshoot quality issues, and implement data governance and access controls.
Documentation
- Maintain data definitions and lineage, ensuring Looker remains a "single source of truth" for the organization.
HOW WILL YOU EXPERIENCE SUCCESS WITH US?
- You have a Master’s degree in Computer Science, Data Engineering, or a related scientific field and 3-5 years of experience in data engineering or analytics engineering.
- You have strong proficiency in Looker (LookML) or Power BI, and in SQL/dbt. You know how to build sophisticated dashboards and the transformation layers that power them.
- You have a deep understanding of Google Cloud Platform (BigQuery) and expert-level SQL and Python skills.
- You have experience building pipelines with Airflow/Cloud Composer.
- You like to work in a fast-paced environment, quickly delivering new features for demanding business users using an Agile/DevOps operating model.
HOW DO WE KEEP YOU SMILING?
- In a young and dynamic team focused on personal development, you will take part in Cartier’s data journey, leveraging state-of-the-art tools.
- You will work with multicultural stakeholders from different business units at a leading company in the luxury industry.
- You will have the opportunity to mentor team members as our data maturity grows.