Databricks Engineer (Data Engineer)
Myticas Consulting
Winnipeg · On-site · Full-time · Posted today
Qualifications
- Strong hands-on experience with Databricks (PySpark, Delta Lake, notebooks) – core requirement
- Proven ability to build and optimize ETL/ELT data pipelines in a lakehouse environment
- Experience with Azure Data Lake Storage (ADLS Gen2) for scalable data storage
- Hands-on development using Azure Data Factory (ADF) for orchestration and pipelines
- Experience with Azure Functions for serverless data processing
- Solid understanding of lakehouse architecture (Databricks + ADLS integration)
- Strong proficiency in Python and SQL for data transformation and pipeline logic
- Experience with data modeling, partitioning, and performance optimization
- Familiarity with Databricks Unity Catalog (data governance, access control)
- Experience integrating Databricks with Snowflake, APIs, or downstream BI systems
- Exposure to CI/CD pipelines (Azure DevOps, GitHub Actions) for data workflows
- Experience with infrastructure-as-code tools (Terraform or similar) is an asset
- Familiarity with orchestration tools (Airflow, dbt, or ADF pipelines)
- Strong communication skills with client-facing / consulting experience
- Databricks certifications preferred (a strong indicator of hands-on expertise)
Skills
ADF · ADLS Gen2 · Airflow · APIs · Azure DevOps · Azure Functions · Databricks · Delta Lake · dbt · ETL · ELT · GitHub Actions · lakehouse · Python · SQL · Snowflake · Terraform · Unity Catalog