
Snowflake Azure Data Engineer

Tredence

Bengaluru · On-site · Full-time

About the role

Job Location

  • Kolkata
  • Chennai
  • Pune
  • Bangalore
  • Gurugram

Experience

  • 6 to 14 years

Notice Period

  • 30 days / 60 days / 90 days

Primary Roles and Responsibilities

  • Develop modern data warehouse solutions using Snowflake, Databricks, and ADF.
  • Provide forward-thinking solutions in the data engineering and analytics space.
  • Collaborate with DW/BI leads to understand new ETL pipeline development requirements.
  • Triage issues to find gaps in existing pipelines and fix them.
  • Work with the business to understand reporting-layer needs and develop a data model that fulfills them.
  • Help junior team members resolve issues and technical challenges.
  • Drive technical discussions with client architects and team members.
  • Orchestrate data pipelines via the Airflow scheduler.
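A typical transformation step in such a pipeline might look like the Snowflake SQL sketch below; the database, schema, table, and column names are illustrative assumptions, not part of this posting.

```sql
-- Illustrative sketch only: all schema, table, and column names are assumptions.
-- Upsert the latest staged order rows into a reporting-layer fact table.
MERGE INTO reporting.fact_orders AS tgt
USING staging.orders_delta AS src
  ON tgt.order_id = src.order_id
WHEN MATCHED THEN UPDATE SET
  tgt.order_status = src.order_status,
  tgt.order_amount = src.order_amount,
  tgt.updated_at   = src.updated_at
WHEN NOT MATCHED THEN INSERT
  (order_id, order_status, order_amount, updated_at)
VALUES
  (src.order_id, src.order_status, src.order_amount, src.updated_at);
```

In practice a statement like this would be wrapped in a scheduled unit of work, e.g. an Airflow task or a Snowflake task, as the last bullet above suggests.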

Skills and Qualifications

  • Bachelor's and/or master's degree in computer science, or equivalent experience.
  • 3+ years of total IT experience, including 2+ years on data warehouse/ETL projects.
  • Expertise in Snowflake security, Snowflake SQL and designing/implementing other Snowflake objects.
  • Hands‑on experience with Snowflake utilities, SnowSQL, Snowpipe, Snowsight and Snowflake connectors.
  • Deep understanding of Star and Snowflake dimensional modeling.
  • Strong knowledge of Data Management principles.
  • Good understanding of Databricks Data & AI platform and Databricks Delta Lake Architecture.
  • Should have hands‑on experience in SQL and Spark (PySpark).
  • Experience in building ETL / data warehouse transformation processes.
  • Experience with open-source non-relational/NoSQL data stores, including MongoDB, Cassandra, and Neo4j.
  • Experience working with structured and unstructured data including imaging & geospatial data.
  • Experience working in a DevOps environment with tools such as Terraform, CircleCI, and Git.
  • Proficiency in RDBMS, complex SQL, PL/SQL, Unix Shell Scripting, performance tuning, troubleshooting and query optimization.
  • Databricks Certified Data Engineer Associate/Professional Certification (Desirable).
  • Comfortable working in a dynamic, fast‑paced, innovative environment with several ongoing concurrent projects.
  • Should have experience working in Agile methodology.
  • Strong verbal and written communication skills.
  • Strong analytical and problem‑solving skills with a high attention to detail.
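The star-schema bullet above can be made concrete with a minimal dimensional model: one fact table keyed to conformed dimensions via surrogate keys. All names below are illustrative assumptions, not from this posting.

```sql
-- Illustrative star schema; names and types are assumptions for illustration.
CREATE TABLE dim_customer (
  customer_key  NUMBER AUTOINCREMENT PRIMARY KEY,  -- surrogate key
  customer_id   VARCHAR NOT NULL,                  -- natural/business key
  customer_name VARCHAR,
  region        VARCHAR
);

CREATE TABLE dim_date (
  date_key  NUMBER PRIMARY KEY,                    -- e.g. 20240131
  full_date DATE NOT NULL,
  year      NUMBER,
  month     NUMBER
);

CREATE TABLE fact_sales (
  customer_key NUMBER REFERENCES dim_customer (customer_key),
  date_key     NUMBER REFERENCES dim_date (date_key),
  quantity     NUMBER(10, 0),
  sales_amount NUMBER(12, 2)
);
```

A snowflake schema would further normalize the dimensions (e.g. splitting region out into its own table), whereas the star form keeps reporting joins shallow. Note that Snowflake records primary/foreign key constraints for metadata but does not enforce them.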

Mandatory Skills

  • Snowflake
  • Azure Data Factory


Skills

ADF, Airflow, Cassandra, CircleCI, Databricks, ETL, Git, MongoDB, Neo4j, NoSQL, PL/SQL, PySpark, RDBMS, Spark, Snowflake, SQL, Terraform, Unix Shell Scripting
