Senior Manager
Tredence
Rajamahendravaram · On-site · Full-time · Lead · 1w ago
About the role
Location
Bangalore / Chennai / Hyderabad / Gurgaon / Pune / Kolkata
Role Overview
We are seeking a highly skilled Technical Project Manager (TPM) with strong experience in Data Engineering and the Databricks platform to lead end-to-end delivery of modern data platform initiatives. The ideal candidate will bridge business and technical teams, manage large-scale data programs, and ensure the successful execution of cloud-based data engineering projects.
Key Responsibilities
- Lead and manage end-to-end data engineering projects on Databricks.
- Drive delivery of data platform modernization, migration, and transformation initiatives.
- Collaborate with architects and engineers to design scalable data solutions using Databricks, Spark, Delta Lake, and cloud platforms (AWS/Azure/GCP).
- Define project scope, timelines, budgets, and resource allocation.
- Track progress, manage risks, and ensure adherence to quality and governance standards.
- Facilitate Agile ceremonies (Sprint planning, stand-ups, retrospectives).
- Coordinate with stakeholders including Business, Data Science, Analytics, and Infrastructure teams.
- Ensure implementation of data governance, security, and compliance frameworks.
- Monitor KPIs and provide regular status reports to leadership.
- Drive best practices in CI/CD, DevOps, and DataOps for data engineering workflows.
Technical Skills Required
- Strong hands-on understanding of:
  - Databricks (workspace setup, cluster management, jobs, workflows)
  - Apache Spark (PySpark/Scala)
  - Delta Lake architecture
  - Data Lake and Lakehouse architecture
- Experience with at least one cloud platform: AWS / Azure / GCP
- Knowledge of:
  - Data pipelines (batch and streaming)
  - ETL/ELT frameworks
  - SQL and Python
  - Data warehousing concepts
- Familiarity with:
  - Airflow / ADF / Glue / Cloud Composer
  - Git, CI/CD pipelines
  - Data governance and security tools
Project Management Skills
- Strong experience in Agile/Scrum methodology
- Proven track record managing large‑scale data transformation programs
- Stakeholder management & executive reporting
- Budgeting & vendor coordination
- Risk management & mitigation planning
- PMP / PRINCE2 / Scrum Master certification (preferred)
Qualifications
- Bachelor’s or Master’s degree in Computer Science, Engineering, or related field.
- 14+ years of overall experience, including at least 5 years in Technical Project Management.
- Prior hands‑on experience in Data Engineering is highly preferred.
Nice to Have
- Experience in Data Mesh or Lakehouse architecture
- Exposure to ML/AI data pipelines
- Experience in regulated industries (BFSI, Healthcare, etc.)
What We Offer
- Opportunity to lead enterprise‑scale data initiatives
- Exposure to modern cloud data stack
- Collaborative and innovation‑driven environment
Skills
ADF, Agile, Airflow, Apache Spark, AWS, Azure, CI/CD, Cloud Composer, Data Lake, Data pipelines, Data warehousing, Databricks, Delta Lake, DevOps, ETL, GCP, Git, Glue, SQL, Spark, Scrum, Python, DataOps, PySpark, Scala