
comparis.ch: Data Engineer, 80-100% (f/m/x), remote

comparis.ch

Remote (Global) · Lead · Posted 3 days ago

About

  • Headquarters: Zürich
  • Website: https://en.comparis.ch/
  • Remote work: Fully remote position. Applicants should be within a maximum time‑zone difference of ±2 hours from Switzerland (GMT+2).

Responsibilities

What you will do

  • Design, implement and continuously optimize platform services on our modern industry‑standard GCP‑based data environment.
  • Interact with software architects/engineers, data scientists and business analysts in an agile environment to define data needs & provide solutions.
  • Develop a high‑quality code base for our data pipelines.
  • Develop, provision and monitor our data products.
  • Maintain our data pipeline orchestration, data lake and data warehouse.
  • Pursue your own data and engineering initiatives.

Requirements

What we expect from you

  • Knowledgeable and experienced in SQL programming and dbt.
  • At least 3 years of experience in a similar data science/engineering environment with a strong track record of data projects on Google Cloud Platform (GCP).
  • Experience with GCP native services such as BigQuery, Cloud Run, Dataproc, and Dataflow (highly appreciated).
  • Experience developing high‑quality software, including unit & integration tests, in one or more languages (e.g., Python or Java), while leveraging CI/CD tools and Git.
  • Experience managing data pipeline workflows with Airflow.
  • Experience with containerization (Docker) and infrastructure‑as‑code frameworks (Terraform).
  • Experience with distributed computing technologies like Apache Spark and knowledge of ML concepts/frameworks (strong plus).
  • Openness to new technologies and challenges.

Benefits

In return, here is what you can expect from us

  • Opportunity to join – and shape – a growing Data Team of diverse, highly motivated experts.
  • Work on AI‑driven data products to deliver a highly relevant customer experience.
  • Provide tools and knowledge across the organization for better, more informed decisions.
  • Agile, interdisciplinary teams that highly value autonomy.
  • Position can be filled remotely.

Application

We look forward to receiving your application!

Skills

Airflow, BigQuery, Cloud Run, Dataproc, Dataflow, Docker, GCP, Git, Java, ML, Python, SQL, Spark, Terraform, dbt
