
Principal Cloud Data Engineer

SII Group India

India · On-site · Full-time · Lead · 1w ago

About the role

Data Engineer • Experience: 5–8 yrs • Location: Noida • Work mode: Hybrid

Job Description:

Key Responsibilities:

• Data Pipeline Development: Design, build, and optimize scalable ETL/ELT pipelines to process and transform data from various sources into Snowflake.
• Data Modeling: Build and maintain data models within Snowflake for optimized querying and reporting.
• Cloud Infrastructure: Leverage AWS services (e.g., S3, Redshift, Lambda, Glue) for data storage, processing, and orchestration.
• Automation & Infrastructure as Code: Use Terraform to automate and manage cloud infrastructure deployments, ensuring scalability, reliability, and efficiency.
• Reporting & Visualization: Collaborate with BI teams to integrate data with Tableau for reporting, dashboards, and analytics.
• Data Quality & Governance: Implement best practices for data quality, governance, and security in line with company policies.
• Performance Optimization: Continuously monitor and improve the performance of data systems and pipelines, ensuring low latency and high availability.
• Collaboration: Work closely with cross-functional teams (data scientists, analysts, product managers) to deliver actionable insights and products.
• Troubleshooting & Support: Provide ongoing support to keep data systems and pipelines running smoothly, addressing issues as they arise.
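As a rough illustration of the pipeline work listed above — the shape of an extract/transform/load step, not this team's actual stack — a minimal Python sketch might look like the following. The sample data, column names, and in-memory "warehouse" are all hypothetical; a real pipeline would stage the result into Snowflake via its connector or `COPY INTO`.

```python
import csv
import io

# Hypothetical raw extract: in a real pipeline this would be pulled
# from S3 or an upstream system rather than an inline string.
RAW_CSV = """order_id,region,amount
1,north,120.50
2,south,80.00
3,north,45.25
"""

def extract(raw: str) -> list[dict]:
    """Parse the raw CSV extract into row dictionaries."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows: list[dict]) -> list[dict]:
    """Cast amounts to floats and aggregate totals per region."""
    totals: dict[str, float] = {}
    for row in rows:
        totals[row["region"]] = totals.get(row["region"], 0.0) + float(row["amount"])
    return [{"region": r, "total_amount": t} for r, t in sorted(totals.items())]

def load(rows: list[dict], warehouse: dict) -> None:
    """Stand-in for a Snowflake load: write to an in-memory 'table'."""
    warehouse["region_totals"] = rows

warehouse: dict = {}
load(transform(extract(RAW_CSV)), warehouse)
print(warehouse["region_totals"])
```

The three-function split mirrors how such pipelines are usually decomposed, which keeps each stage independently testable before wiring it into an orchestrator such as Airflow.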

Skills & Qualifications:

• Experience with Snowflake: Proficiency in Snowflake for data warehousing, including data loading, transformation, and optimization.
• AWS Expertise: Hands-on experience with AWS tools such as S3, Redshift, Lambda, and Glue for data processing and storage.
• Data Pipeline Development: Experience building and maintaining end-to-end data pipelines using tools like Apache Airflow, dbt, or similar.
• Tableau: Solid experience integrating and visualizing data in Tableau for reporting and dashboard creation.
• Terraform: Experience with Infrastructure as Code (IaC) using Terraform to manage cloud resources.
• SQL Proficiency: Strong SQL skills for data querying, transformation, and troubleshooting.
• Programming: Familiarity with Python or other programming languages for building custom data pipelines and automation.
• Data Governance & Security: Understanding of data governance principles, security best practices, and compliance requirements.
• Communication: Strong communication skills for collaborating with technical and non-technical teams.
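To give a flavor of the SQL-for-transformation skill named above, here is a small sketch using Python's built-in sqlite3 as a stand-in warehouse (the equivalent `GROUP BY` deduplication works the same way in Snowflake). The table and column names are invented for the example.

```python
import sqlite3

# In-memory SQLite stands in for a warehouse; all names are hypothetical.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE raw_events (event_id INTEGER, user_id TEXT, loaded_at TEXT);
INSERT INTO raw_events VALUES
    (1, 'u1', '2024-01-01'),
    (1, 'u1', '2024-01-02'),  -- duplicate event from a later load
    (2, 'u2', '2024-01-01');
""")

# Deduplicate: keep only the most recently loaded row per event,
# a common cleanup step before building downstream models.
rows = conn.execute("""
    SELECT event_id, user_id, MAX(loaded_at) AS loaded_at
    FROM raw_events
    GROUP BY event_id, user_id
    ORDER BY event_id
""").fetchall()
print(rows)
```

Interview questions for roles like this often probe exactly this kind of dedup/aggregation pattern, since raw landing tables routinely contain replayed or late-arriving rows.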

Nice to Have:

• Experience with containerization (Docker, Kubernetes).
• Knowledge of machine learning models and their integration into data pipelines.
• Agile or Scrum methodology experience.
• Familiarity with CI/CD processes for data engineering workflows.
