
Senior Software Engineer, AI Data

Jobgether

Remote · France · Contract · Senior

About the role

This role offers the opportunity to design and build scalable, high-performance systems that power next-generation AI data platforms. You will work on mission-critical pipelines that support large-scale model training and evaluation, impacting millions of inference calls and hours of processed data. The role combines software engineering rigor with AI-focused infrastructure, providing a chance to shape technical execution, optimize data workflows, and drive innovation in a fast-paced, high-impact environment. You will collaborate closely with researchers, platform engineers, and other stakeholders to deliver reliable, maintainable, and cost-efficient systems that accelerate AI model development. This is an ideal position for engineers who thrive in a startup-like culture where ownership, technical excellence, and measurable impact are paramount.

Accountabilities

  • Architect and implement scalable AI data infrastructure to support model training and evaluation at scale
  • Build efficient, self-serve data processing pipelines leveraging cloud services and distributed systems
  • Design cost-effective storage, monitoring, and resource management solutions to maximize efficiency
  • Lead adoption of cutting-edge ML/AI tools and frameworks to enhance team velocity and system reliability
  • Streamline workflows, introduce new tooling, and maintain high-quality documentation for engineering processes
  • Troubleshoot and resolve complex technical issues while improving system performance, quality, and cost-efficiency
  • Participate in on-call rotations to ensure operational reliability of AI data platforms

Requirements

  • 5+ years of professional software engineering experience with strong Python and SQL proficiency
  • Solid understanding of software engineering fundamentals: data structures, algorithms, system design, architectural patterns, and testing strategies
  • Experience with RESTful APIs, distributed systems, and containerization (Docker) in cloud environments
  • Proven ability to deliver high-quality, maintainable code in collaborative team settings
  • Strong communication and stakeholder management skills, with the ability to explain technical concepts clearly
  • Startup mindset: able to navigate changing priorities, rapid iteration, and pragmatic decision-making

Preferred Qualifications

  • Experience with GCP services (BigQuery, GCS, Cloud Run, GKE)
  • Familiarity with distributed processing frameworks (Apache Beam, PySpark)
  • Knowledge of workflow orchestration tools (Airflow, Prefect, Dagster)
  • Background in ML/AI infrastructure, monitoring tools (Datadog), or data engineering roles
  • Experience collaborating directly with researchers

Benefits

  • Competitive salary with equity grants and location-adjusted compensation
  • Fully remote work with flexible hours and autonomy over work-life balance
  • Comprehensive employer-paid health benefits
  • Access to cutting-edge AI tools and frameworks, fostering skill growth and innovation
  • Collaborative, high-impact environment with opportunities to shape technical strategy
  • Professional development opportunities including mentorship, training, and learning resources

Skills

Algorithms, Apache Beam, Architectural Patterns, BigQuery, Cloud Run, Datadog, Docker, GCP, GCS, GKE, ML/AI, Prefect, PySpark, Python, RESTful APIs, SQL, System Design, Testing Strategies, Workflow Orchestration
