Data Engineer
Kasmo Inc
Oaks · Hybrid · Full-time · Posted today
About the role
- Develop and maintain dbt models, macros, and SQL scripts to transform data within Snowflake.
- Optimize data models, design star/snowflake schemas, manage warehouse performance, and implement clustering and materialized views.
- Create scalable ELT/ETL pipelines to ingest and transform data from diverse sources.
- Write modular, testable SQL using version control and manage dbt project structures.
- Implement data quality checks, automated tests, anomaly detection, and data security, including RBAC, masking, and row-level access in Snowflake.
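As a rough illustration of the data quality checks this role calls for, the sketch below runs three common checks (non-empty table, no NULL keys, unique keys) against a database. This is a minimal example, not Kasmo's actual tooling: sqlite3 stands in for Snowflake, and the `orders` table and `order_id` column are hypothetical names.

```python
import sqlite3

def run_quality_checks(conn, table, pk_column):
    """Run basic data-quality checks; return a list of failure messages."""
    cur = conn.cursor()
    failures = []
    # Check 1: the table is not empty
    (row_count,) = cur.execute(f"SELECT COUNT(*) FROM {table}").fetchone()
    if row_count == 0:
        failures.append("table is empty")
    # Check 2: the key column contains no NULLs
    (nulls,) = cur.execute(
        f"SELECT COUNT(*) FROM {table} WHERE {pk_column} IS NULL"
    ).fetchone()
    if nulls:
        failures.append(f"{nulls} NULL values in {pk_column}")
    # Check 3: the key column is unique (distinct non-NULL values == non-NULL rows)
    (distinct,) = cur.execute(
        f"SELECT COUNT(DISTINCT {pk_column}) FROM {table}"
    ).fetchone()
    if distinct != row_count - nulls:
        failures.append(f"duplicate values in {pk_column}")
    return failures

# Demo with an in-memory database and a hypothetical orders table
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 9.5), (2, 12.0), (2, 3.25)])
print(run_quality_checks(conn, "orders", "order_id"))
# The duplicate order_id=2 rows trigger the uniqueness check
```

In a dbt project the same intent is usually expressed declaratively (e.g. `not_null` and `unique` tests in a schema YAML file) rather than hand-written like this.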
Basic Qualifications
- 5+ years of experience with Snowflake and strong SQL proficiency.
- 5+ years of hands-on experience developing with dbt.
- 3+ years of experience with Airflow development.
- Solid grasp of data warehouse concepts.
- Education: Bachelor's degree in Computer Science, Data Engineering, or a related field.
Nice to Have
- Programming: Proficiency in Python for scripting and automation.
- Cloud Platforms: Experience with AWS, GCP, or Azure environments.
Skills
Airflow · DBT · Data Warehouse · Python · SQL · Snowflake