
Lead Data Engineer

Aligarh · On-site · Full-time · Lead · 2w ago

About the role

This is a lead-level role for a Google Cloud Platform (GCP) Data Engineer with SAP data integration expertise. It calls for deep knowledge of cloud data platforms, data governance, security, and data modeling, together with hands-on experience in ETL/ELT pipelines, SAP data extraction, system migrations, and analytics. The role involves leading and mentoring a team of data engineers and collaborating closely with business stakeholders and engineering teams.

Requirements

  • Google Cloud Platform (GCP) Data Engineer with SAP data integration expertise
  • Deep expertise in cloud data platforms, data governance, security, and data modeling
  • Hands-on experience in ETL/ELT pipelines, SAP data extraction, system migrations, and analytics
  • Collaboration with business stakeholders and engineering teams is essential
  • 8-10 years of proven experience with GCP services including BigQuery, Cloud Composer, Cloud Storage, Pub/Sub, and Dataflow
  • Strong SQL and Python programming skills
  • Hands-on experience with SAP data extraction, modeling, and integration from ERP, BW, and/or HANA systems
  • Knowledge of data governance frameworks and security best practices
  • Familiarity with DevOps tools for data
  • Understanding of Google Cortex Framework for SAP-GCP integrations

Responsibilities

  • Lead and mentor a team of data engineers in building ETL/ELT pipelines for SAP and other ERP sources into GCP
  • Set engineering standards, best practices, and coding guidelines
  • Provide technical direction, code reviews, and support for complex data solutions
  • Collaborate with project managers, provide estimates, track progress, and remove roadblocks to ensure timely completion of work
  • Collaborate with BI teams and data analysts to enable reporting solutions
  • Design conceptual, logical, and physical data models to support analytics and operational workloads
  • Implement star, snowflake, and data vault models for analytical systems
  • Design data solutions on GCP using BigQuery, Cloud Storage, Dataflow, and Dataproc
  • Implement cost optimization strategies for GCP workloads
  • Design and orchestrate ETL/ELT pipelines using Apache Airflow (Cloud Composer) and Dataflow (see the illustrative sketch after this list)
  • Integrate data from multiple systems including SAP BW, SAP HANA, and BusinessObjects, using tools such as SAP SLT or the Google Cortex Framework
  • Leverage integration tools such as Boomi for system interoperability
  • Develop complex SQL queries for analytics, transformations, and performance tuning
  • Build automation scripts and utilities in Python
  • Lead on-premises-to-cloud migrations for enterprise data platforms (SAP BW/BusinessObjects)
  • Manage migration of SAP datasets to GCP ensuring data integrity and minimal downtime
  • Implement CI/CD pipelines for data workflows using GitHub Actions, Cloud Build, and Terraform
  • Apply infrastructure-as-code principles for reproducible and scalable deployments
  • Design and develop conceptual, logical, and physical data models for enterprise systems
  • Translate business requirements into data entities, attributes, relationships, and constraints
  • Build and maintain dimensional models (Star/Snowflake schema) for data warehouses and BI reporting
  • Develop data models for data lake/lakehouse environments (BigQuery, Snowflake, Azure Synapse, Databricks)
  • Define and document data standards, naming conventions, and data definitions
  • Collaborate with Data Engineering teams to ensure models are implemented accurately in ETL/ELT pipelines
  • Work with BI teams to optimize models for reporting tools such as Power BI, Tableau, SAP BW, etc.
  • Support integration across multiple source systems (SAP, Salesforce, Oracle, etc.)
  • Ensure data models comply with data governance, security, and compliance requirements
  • Create and maintain documentation including ERDs, data dictionaries, and lineage diagrams
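
As a concrete illustration of the orchestration work described above, here is a minimal Cloud Composer (Airflow) DAG sketch in Python that loads an SAP extract staged in Cloud Storage into BigQuery and then builds a reporting table with SQL. It is a sketch only: it assumes the apache-airflow-providers-google package is installed, and the project, bucket, dataset, and table names are hypothetical placeholders rather than details from this posting.

  # Minimal sketch: SAP SLT extract (CSV in Cloud Storage) -> BigQuery raw table -> reporting table.
  # All project, bucket, dataset, and table names below are hypothetical.
  from datetime import datetime

  from airflow import DAG
  from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator
  from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

  with DAG(
      dag_id="sap_to_bigquery_daily",        # hypothetical DAG name
      start_date=datetime(2024, 1, 1),
      schedule_interval="@daily",
      catchup=False,
  ) as dag:
      # Load the day's staged SAP extract into a raw BigQuery table.
      load_raw = GCSToBigQueryOperator(
          task_id="load_sap_extract",
          bucket="example-sap-landing",
          source_objects=["slt/sales/{{ ds }}/*.csv"],
          destination_project_dataset_table="example-project.raw.sap_sales",
          source_format="CSV",
          write_disposition="WRITE_TRUNCATE",
          autodetect=True,
      )

      # Transform the raw table into a reporting (fact) table with SQL.
      build_fact = BigQueryInsertJobOperator(
          task_id="build_sales_fact",
          configuration={
              "query": {
                  "query": (
                      "CREATE OR REPLACE TABLE `example-project.mart.fct_sales` AS "
                      "SELECT * FROM `example-project.raw.sap_sales`"
                  ),
                  "useLegacySql": False,
              }
          },
      )

      load_raw >> build_fact

In a fuller pipeline, the load step would typically be fed by SAP SLT replication, the Google Cortex Framework, or a Dataflow job, and the SQL step would implement the star/snowflake transformations called out in the responsibilities above.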

Skills

Apache Airflow, BigQuery, Boomi, Cloud Build, Cloud Composer, Cloud Storage, Databricks, Dataflow, Dataproc, GitHub Actions, Google Cortex Framework, GCP, Oracle, Power BI, Python, Pub/Sub, Salesforce, SAP, SAP BW, SAP HANA, SAP SLT, Snowflake, SQL, Tableau, Terraform, Azure Synapse
