
Senior Data Engineer (GCP)

Mobile Programming

Remote · India · Full-time · Senior · 4d ago

About the role

Candidate Skills

  • GCP
  • BigQuery
  • Dataflow
  • Pub/Sub
  • SQL
  • Python
  • ETL
  • Data Engineering

Experience

  • 7+ Years

Location

  • City: MP Office / Remote
  • Country: India

Job Description

We are looking for a Senior Data Engineer with strong expertise in Google Cloud Platform (GCP) to design, build, and optimize scalable data pipelines and data infrastructure. The ideal candidate should have hands-on experience with big data technologies and cloud-based data solutions.

Key Responsibilities

  • Design and develop scalable data pipelines on GCP
  • Build and maintain ETL/ELT processes for large datasets
  • Work with BigQuery, Dataflow, Pub/Sub, Cloud Storage
  • Optimize data processing performance and cost efficiency
  • Collaborate with data scientists, analysts, and cross-functional teams
  • Ensure data quality, governance, and security
  • Troubleshoot and resolve data-related issues
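
To illustrate the data-quality responsibility above, here is a minimal, hypothetical sketch of the kind of validation gate a data engineer might build into an ETL step before loading rows into a warehouse table. The field names (`order_id`, `amount`, `created_at`) and the function name are illustrative assumptions, not taken from the posting.

```python
from datetime import datetime

# Hypothetical required fields for an incoming record batch.
REQUIRED_FIELDS = ("order_id", "amount", "created_at")

def clean_records(records):
    """Split raw rows into (good, rejected).

    A row is rejected if a required field is missing/empty or if a
    type cast fails; good rows are normalized to warehouse-ready types.
    """
    good, rejected = [], []
    for row in records:
        # Reject rows with missing or empty required fields.
        if any(row.get(f) in (None, "") for f in REQUIRED_FIELDS):
            rejected.append(row)
            continue
        try:
            cleaned = {
                "order_id": str(row["order_id"]),
                "amount": float(row["amount"]),
                "created_at": datetime.fromisoformat(row["created_at"]),
            }
        except (ValueError, TypeError):
            # Reject rows whose values cannot be cast to the target types.
            rejected.append(row)
            continue
        good.append(cleaned)
    return good, rejected
```

In a real pipeline the rejected rows would typically be routed to a dead-letter destination (e.g. a Cloud Storage bucket or a side table) for later inspection rather than silently dropped.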

Required Skills

  • Strong experience in GCP (BigQuery, Dataflow, Pub/Sub)
  • Expertise in SQL and Python / Scala
  • Hands-on with ETL tools and data pipeline development
  • Experience with Apache Spark / Beam
  • Knowledge of data warehousing concepts
  • Familiarity with CI/CD and version control (Git)

Good to Have

  • GCP certifications (Professional Data Engineer)
  • Experience with Airflow / Composer
  • Knowledge of real-time data streaming
  • Exposure to Data Lakes and Lakehouse architecture

Skills

BigQuery · Cloud Storage · Dataflow · Data Engineering · ETL · GCP · Git · Pub/Sub · Python · SQL
