Senior Azure Fabric / Databricks Engineer
EazyML
Remote · India · Full-time · Senior
About
- EazyML (www.EazyML.com), recognized by Gartner, specializes in Responsible AI.
- Our solutions facilitate proactive compliance and sustainable automation. The company is associated with breakthrough startups like Amelia.ai.
Role Overview
- This is a full-time remote role for a Senior Azure Fabric / Databricks Engineer with experience in Microsoft Fabric.
- The role is fully remote; you can work from home anywhere in India (the job location is India).
- We're hiring a Senior Azure Fabric Data Engineer with experience supporting ETL and ELT development and working with Fabric data engineering capabilities to design and lead scalable, cloud-native data platforms.
Responsibilities
- In this role, you will architect and build high-performance data engineering solutions, develop robust ETL/ELT pipelines, and lead initiatives across data architecture, feature engineering, and AI-driven data workflows.
- You will work extensively with Databricks, Spark, Delta Lake, Airflow, and Python, while applying modern Gen AI tools to improve developer productivity and data quality.
- This role emphasizes architecture, governance, and best practices, combined with hands-on data engineering work.
Requirements
- Experience with Microsoft Fabric, Azure Databricks, Azure OneLake, Spark, Delta Lake, Azure Synapse, PySpark, SQL, Power BI, data architecture, the Azure data engineering tech stack, pipeline optimization, data migrations, Azure Data Factory (ADF), and Python.
- Experience defining ETL/ELT strategy and orchestration, building high-performance pipelines in Python and SQL, and partnering with ML teams.
Key Skills
- Microsoft Fabric • Databricks • Spark • Delta Lake • Data Architecture • ETL/ELT • Python • SQL • Azure • LLMs/NLP
- Azure OneLake • Azure Synapse • Azure Data Factory (ADF) • PySpark • Pipeline Optimization • Data Migrations • Parquet File Format
Experience
- 7+ years | CS/IT degree preferred
Skills
ADF • Airflow • Azure • Azure Data Factory • Azure Data Lake • Azure Fabric • Azure Synapse • Databricks • Delta Lake • ETL • ELT • LLMs/NLP • Microsoft Fabric • Python • PySpark • Spark • SQL