Snowflake Data Engineer
VySystems
Canada · On-site · Full-time · Posted today
About the role
Responsibilities and Duties
- Lead the design and implementation of Snowflake-based data architectures: schemas; data vault, warehouse, and star-schema models; materialized views; and zero-copy cloning patterns for environment management.
- Build and maintain production ETL/ELT pipelines into Snowflake using Snowpipe, Snowpark, Streams & Tasks, and partner tools (StreamSets, dbt, Fivetran, Matillion, Airbyte, etc.).
- Develop Snowflake-native utilities and apps (Snowpark for Python, UDFs, external functions, and internal tools) to accelerate developer productivity and data product delivery.
- Optimize query performance and cost through clustering keys, partitioning strategies, resource monitors, warehouse sizing, and workload isolation.
- Implement data governance, security, and access controls in Snowflake, including role-based access control, masking policies, object tagging, data lineage, and audit logging.
- Automate infrastructure and deployments: IaC for Snowflake objects and cloud infrastructure, CI/CD pipelines, and automated testing for SQL/Snowpark code.
- Build observability and operational tooling: monitoring, alerting, usage/cost reporting, and incident playbooks for Snowflake workloads.
- Mentor engineers, review designs, and contribute to roadmap decisions for Snowflake platform evolution.
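As a concrete illustration of the warehouse-sizing and cost-optimization work above, here is a minimal, hypothetical Python sketch for estimating warehouse credit consumption. It assumes Snowflake's documented per-hour credit rates for standard warehouse sizes and per-second billing with a 60-second minimum; the function name and structure are illustrative, not part of any Snowflake API.

```python
# Illustrative sketch: compare credit consumption across warehouse sizes.
# Credit rates per hour for standard warehouses (documented by Snowflake);
# billing is per-second with a 60-second minimum per resume.
CREDITS_PER_HOUR = {
    "XSMALL": 1, "SMALL": 2, "MEDIUM": 4, "LARGE": 8, "XLARGE": 16,
}

def estimate_credits(size: str, runtime_seconds: float) -> float:
    """Credits consumed by one warehouse run of the given duration."""
    billable = max(runtime_seconds, 60)  # 60-second minimum charge
    return CREDITS_PER_HOUR[size.upper()] * billable / 3600

# A larger warehouse often finishes faster, so total cost can be comparable:
xs_cost = estimate_credits("XSMALL", 3600)  # one hour on X-Small
lg_cost = estimate_credits("LARGE", 500)    # ~8 minutes on Large
```

This kind of back-of-the-envelope model is useful when deciding whether to scale a warehouse up (faster, same-order cost) versus isolating workloads on separate warehouses.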
Required Skills and Experience
- Strong hands-on experience designing and operating Snowflake in production.
- Deep experience with Snowflake features such as Snowpark, Streams & Tasks, Snowpipe, Time Travel, cloning, materialized views, external functions, and user-defined functions.
- Hands-on ETL/ELT development experience with dbt, SQL, and one or more ingestion tools (Streamsets, Fivetran, Matillion, Airbyte, Kafka connectors).
- Proficient in Python (Snowpark/connector), SQL tuning and query optimization techniques.
- Experience with IaC and automation (Terraform, GitHub Actions, Jenkins, or equivalent).
- Strong knowledge of cloud platforms and native services (AWS, Azure or GCP) as they relate to Snowflake deployment and integrations.
- Solid understanding of medallion architecture, data modeling patterns, data governance, and secure data sharing.
- Demonstrated ability to implement CI/CD, automated testing, and production operational practices for data workloads.
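To make the "automated testing for data workloads" requirement concrete, here is a hedged sketch of one common pattern: unit-testing a portable SQL transformation locally (here against SQLite from the Python standard library) before it is promoted to Snowflake. Dialects differ, so this only suits SQL that avoids Snowflake-specific functions; table and column names are illustrative.

```python
# Illustrative sketch: CI-style test of a SQL aggregation on a tiny fixture.
import sqlite3

TRANSFORM = """
SELECT customer_id, SUM(amount) AS total
FROM orders
GROUP BY customer_id
ORDER BY customer_id
"""

def run_transform(rows):
    """Load fixture rows into an in-memory table and run the transform."""
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE orders (customer_id INTEGER, amount REAL)")
    con.executemany("INSERT INTO orders VALUES (?, ?)", rows)
    return con.execute(TRANSFORM).fetchall()

# The assertion a CI pipeline would run on every change to TRANSFORM:
result = run_transform([(1, 10.0), (1, 5.0), (2, 7.5)])
assert result == [(1, 15.0), (2, 7.5)]
```

In practice, teams run the same pattern against a Snowflake test schema (or dbt tests) for dialect-specific logic.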
Preferred Qualifications
- Snowflake SnowPro Core or advanced Snowflake certifications.
- Experience with dbt (core or Cloud) for transformation and modular SQL engineering.
- Experience with data virtualization, data catalogs or data lineage tools.
- Familiarity with analytics and BI integrations (Looker, Tableau, Power BI) and building Snowflake-optimized semantic layers.
- Experience building internal developer tools or data apps using Snowpark or lightweight web frameworks.
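The "Snowflake-optimized semantic layer" qualification above can be sketched minimally: a metric registry that maps business metric names to SQL expressions, from which BI-facing queries are generated. Everything here (metric names, table, columns) is hypothetical and purely illustrative of the pattern.

```python
# Illustrative sketch: a tiny metric registry of the kind a semantic
# layer exposes to BI tools such as Looker, Tableau, or Power BI.
METRICS = {
    "revenue": "SUM(order_amount)",
    "orders": "COUNT(DISTINCT order_id)",
}

def metric_query(metric: str, table: str, group_by: str) -> str:
    """Generate a grouped aggregation query for a registered metric."""
    expr = METRICS[metric]
    return (f"SELECT {group_by}, {expr} AS {metric} "
            f"FROM {table} GROUP BY {group_by}")

sql = metric_query("revenue", "analytics.orders", "region")
```

Centralizing metric definitions this way keeps dashboards consistent and lets the engineering team tune the generated SQL for Snowflake (clustering, result caching) in one place.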
Skills
Snowflake · Snowpark · Streams & Tasks · Snowpipe · Materialized Views · Zero-Copy Cloning · Data Vault / Star / Snowflake Modeling · SQL · Python · dbt · ETL/ELT Tools (Streamsets, Fivetran, Matillion, Airbyte, Kafka) · Terraform · GitHub Actions · Jenkins · AWS / Azure / GCP · Data Governance · CI/CD · Automated Testing