
Platform Engineer- work on agentic healthcare products- $200,000-$300,000 base plus bonus!

Saragossa

Hybrid Full-time 1w ago

About the role

This is your chance to build the data engine powering one of the most ambitious AI transformations in healthcare.

You’ll be joining a team creating an AI-native revenue operating system that doesn’t just analyze data but reasons over entire medical records, payer logic, and financial workflows to automate how healthcare gets paid. This platform is already live at national scale, embedded in the daily operations of leading health systems, processing hundreds of millions of patient encounters and billions of workflow actions every year. The impact is real, measurable, and only just getting started.

Responsibilities

  • You’ll own the data foundations that make it all possible by designing and building the backend schemas, APIs, and pipelines that power both cutting‑edge AI systems and production user applications.
  • This is hands‑on, high‑ownership engineering where your work directly shapes how intelligent systems operate in the real world.

Requirements

  • Experience building production‑grade data systems behind ML platforms or SaaS products.
  • Comfortable designing microservices and ETL pipelines using languages like Python, Go, or Java.
  • Experience with modern infrastructure (Terraform, Docker, Kubernetes).
  • Know how to orchestrate and scale data flows using tools like Spark, Airflow, or Kafka.
  • Experience with healthcare data standards is a plus, but not required.

Location

  • Based in New York, hybrid with 3 days onsite.

Application

  • No up‑to‑date resume required. Apply now!

Skills

Airflow, Docker, Go, Java, Kafka, Kubernetes, Python, Spark, Terraform
