Snowflake Data Platform Engineer
TMC
About the role
We are looking for an experienced Snowflake Data Platform Engineer to contribute to the development of a modern cloud-based data platform built around Snowflake and AWS.
In this role, you will play a key part in designing, implementing, and operating scalable data solutions, ensuring the platform remains secure, performant, and reliable. Working closely with data engineers, analysts, and platform architects, you will help enable data-driven decision-making across the organization.
This position focuses on hands-on engineering and platform implementation, supporting the overall architecture while driving best practices in automation, governance, and data modelling.
Responsibilities
- Build, maintain, and optimize Snowflake data structures, schemas, and pipelines
- Develop and orchestrate data ingestion and transformation workflows using tools such as Dagster or Airflow (see the sketch after this list)
- Implement Infrastructure as Code (IaC) practices using Terraform to manage Snowflake resources and environments
- Configure and maintain secure authentication and access control mechanisms (OAuth / OIDC)
- Design and maintain data models using dbt, leveraging dimensional modelling as well as Data Vault methodologies
- Ensure performance optimization and cost efficiency across Snowflake workloads
- Integrate and transform data coming from multiple internal and external data sources
- Collaborate with cross-functional teams to deliver robust and reliable data services
- Monitor platform health and proactively improve stability, performance, and data quality
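To give a concrete flavour of this kind of work, below is a minimal sketch of a Dagster asset that loads a Snowflake staging table from an S3 stage. All names here (warehouse, database, schema, stage, and table) are hypothetical placeholders, not details from this posting.

```python
# A minimal sketch, assuming a Dagster + Snowflake setup. All object
# names (warehouse, database, schema, stage, table) are hypothetical.
import os

import snowflake.connector
from dagster import Definitions, asset


def _snowflake_conn():
    # Credentials come from environment variables here; a real platform
    # would use a secrets manager or key-pair / OAuth authentication.
    return snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="LOAD_WH",    # hypothetical warehouse
        database="ANALYTICS",   # hypothetical database
        schema="STAGING",       # hypothetical schema
    )


@asset
def raw_orders() -> None:
    """Copy order files from a hypothetical external S3 stage into Snowflake."""
    conn = _snowflake_conn()
    try:
        conn.cursor().execute(
            "COPY INTO raw_orders FROM @orders_stage "
            "FILE_FORMAT = (TYPE = PARQUET) MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE"
        )
    finally:
        conn.close()


# Registering the asset makes it schedulable and observable from Dagster.
defs = Definitions(assets=[raw_orders])
```

Modelling loads as assets rather than ad hoc scripts is what makes the orchestration, monitoring, and data-quality responsibilities above tractable at platform scale.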
Required Background
- Bachelor’s degree in Computer Science, Engineering, Information Technology, or equivalent experience
- 5+ years of experience working with modern cloud-based data platforms
- Strong practical expertise with Snowflake development and platform management
Core Technical Skills
- Advanced Snowflake usage (query optimization, warehouse configuration, data pipelines)
- Strong experience with dbt for modelling, testing, and deployment
- Knowledge of Data Vault 2.0 and dimensional modelling techniques
- Experience implementing Terraform-based infrastructure automation
- Familiarity with workflow orchestration tools such as Dagster or Airflow
- Solid understanding of AWS services, particularly S3 and IAM
- Knowledge of secure authentication patterns, including OAuth (see the connection sketch after this list)
- Programming experience with Python and/or Java
- Strong understanding of ETL pipelines, data warehousing, and data lifecycle management
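As an illustration of the OAuth and Python skills listed above, here is a minimal sketch of token-based authentication with the Snowflake Python connector. The `fetch_access_token()` helper, account identifier, and service user are hypothetical; in practice the token would come from the platform's OAuth / OIDC identity provider.

```python
# A minimal sketch of OAuth authentication via the Snowflake Python
# connector. fetch_access_token(), the account, and the user are
# hypothetical placeholders.
import snowflake.connector


def fetch_access_token() -> str:
    # Hypothetical stub: exchange client credentials with the identity
    # provider and return a short-lived access token.
    raise NotImplementedError


conn = snowflake.connector.connect(
    account="xy12345.eu-west-1",  # hypothetical account identifier
    user="SVC_DATA_PLATFORM",     # hypothetical service user
    authenticator="oauth",
    token=fetch_access_token(),
)

# Quick sanity check on the authenticated session.
cur = conn.cursor()
cur.execute("SELECT CURRENT_ROLE(), CURRENT_WAREHOUSE()")
print(cur.fetchone())
conn.close()
```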
Nice to Have
- Exposure to Denodo or similar data virtualization tools
- Experience working with Kafka or real-time streaming pipelines (a micro-batch sketch follows below)
- Background in commodities trading or capital markets environments
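For the streaming item above, here is a minimal micro-batch sketch that moves Kafka messages into a Snowflake table. The broker, topic, and table names are hypothetical, and a production pipeline would more likely use Snowpipe Streaming or the managed Snowflake Kafka connector; this only illustrates the shape of the problem.

```python
# A minimal micro-batch sketch, assuming confluent-kafka and the
# Snowflake Python connector. Broker, topic, and table are hypothetical.
import os

import snowflake.connector
from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",  # hypothetical broker
    "group.id": "snowflake-loader",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["trades"])  # hypothetical topic

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
)

batch = []
try:
    while True:
        msg = consumer.poll(timeout=1.0)
        if msg is None or msg.error():
            continue
        batch.append((msg.value().decode("utf-8"),))
        if len(batch) >= 500:  # flush in fixed-size micro-batches
            conn.cursor().executemany(
                "INSERT INTO raw_trades (payload) VALUES (%s)", batch
            )
            batch.clear()
finally:
    consumer.close()
    conn.close()
```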