
Sr Data Engineer

Insight Global

Vancouver · Hybrid · Full-time · Senior · Posted yesterday

About the role

Insight Global is looking to hire a Sr. Data Engineer for a retail client based in Vancouver. This is a hybrid position requiring 3 days onsite per week in Downtown Vancouver. You will be joining a team that works heavily with customer data, owning the customer data pipelines that gather data from multiple sources and consolidate it for different use cases.

Use cases include customer segmentation, reporting and analytics, and advanced analytics feeding ML models that generate product recommendations.

Responsibilities

  • Data pipeline creation and maintenance
  • Stack: GCP, Azure Cloud, Azure Databricks, Snowflake
  • Includes engineering documentation, knowledge transfer to other engineers, future enhancements, and maintenance
  • Create secure data views and publish them to the Enterprise Data Exchange via Snowflake for other teams to consume
  • Data pipeline modernization and migration via Databricks Delta Live Tables (DLT) and Unity Catalog
  • Leverage existing CI/CD processes for pipeline deployment
  • Adhere to PII encryption and masking standards
  • Orchestration tools: ADF, Airflow, Fivetran
  • Languages: SQL, Python
  • Data Modeling: Star and Snowflake schemas
  • Streaming: Kafka, Event Hubs, Spark, Snowflake Streaming
  • DevOps: support improvements to current CI/CD processes
  • Production monitoring and failure support
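To illustrate the PII masking responsibility above, here is a minimal, hypothetical Python sketch of deterministic salted hashing used to pseudonymize an email column before data is published to downstream consumers. The function name, salt value, and record shape are illustrative assumptions, not the client's actual masking standard; a real pipeline would source the salt from a secrets manager and apply the policy inside Snowflake or Databricks.

```python
import hashlib

def mask_pii(value: str, salt: str = "example-salt") -> str:
    """Deterministically pseudonymize a PII value with salted SHA-256.

    Deterministic hashing keeps the column joinable across pipelines
    while hiding the raw value. The default salt is a placeholder for
    illustration only.
    """
    return hashlib.sha256((salt + value).encode("utf-8")).hexdigest()

# Mask the email field of each customer record before publishing.
customers = [
    {"id": 1, "email": "alice@example.com"},
    {"id": 2, "email": "bob@example.com"},
]
masked = [{**c, "email": mask_pii(c["email"])} for c in customers]
```

The same masked value is produced for the same input on every run, so joins on the pseudonymized column still work across pipelines while the raw email never leaves the secure zone.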

Qualifications

  • 5+ years of experience as a Data Engineer
  • Experience with Azure Data Factory, Azure Databricks, Snowflake, and Storage Accounts
  • Experience with Python, specifically PySpark for developing and optimizing pipeline builds
  • Experience with DBT/DLT and working with Medallion Architecture
  • Experience with CI/CD principles and best practices, including Azure Dev Ops and Repos
  • Experience with Azure Cloud deployments and configurations
  • Experience with bug tracking and task management software such as JIRA
  • Experience managing outages, customer escalations, crisis management, and similar circumstances
  • Azure certification
  • Experience with Terraform and ARM templates
  • Retail and/or e-commerce experience; experience working for a multi-channel retailer
  • Experience with Snowflake or any enterprise data warehouse

Equal Opportunity Statement

We are a company committed to creating diverse and inclusive environments where people can bring their full, authentic selves to work every day. We are an equal opportunity/affirmative action employer that believes everyone matters. Qualified candidates will receive consideration for employment regardless of their race, color, ethnicity, religion, sex (including pregnancy), sexual orientation, gender identity and expression, marital status, national origin, ancestry, genetic factors, age, disability, protected veteran status, military or uniformed service member status, or any other status or characteristic protected by applicable laws, regulations, and ordinances.

Application Assistance

If you need assistance and/or a reasonable accommodation due to a disability during the application or recruiting process, please send a request to . To learn more about how we collect, keep, and process your private information, please review Insight Global's Workforce Privacy Policy:

Skills

ADF, Airflow, Azure Cloud, Azure Databricks, Azure Dev Ops, DBT, DLT, Event Hubs, Fivetran, GCP, JIRA, Kafka, Python, PySpark, SQL, Spark, Snowflake, Snowflake Streaming, Terraform, Unity Catalog
