Machine Learning Engineer- AI Data Platform (Minneapolis, MN)

MOBE, LLC

Minneapolis, MN · On-site · Full-time

About the role

Company Overview

MOBE helps people discover new ways to live healthier. We are the whole-person, cross-condition solution that goes further to deliver better health and lower overall costs through evidence-based individual health guidance and pharmacist-led medication management. We empower individuals to make meaningful changes that improve their health and overall well‑being. Behind our innovative solutions are robust data analytics, digital applications, and a uniquely human philosophy. With one‑to‑one connection and compassion, we uncover opportunities, overcome challenges, and motivate people to transform their lives.

At MOBE our team is our most significant asset. We cultivate a culture grounded in curiosity, innovation, and growth. We encourage new ideas, fresh solutions, and meaningful impact. We value a workforce made up of people with differences who are eager to learn from each other and grow personally and professionally. We extend this approach to our partners and communities, seeking to increase understanding and expand opportunities across all groups.

Your Role at MOBE

We are seeking a highly skilled AI Engineer to serve as a core builder of our AI Data Platform. This role sits at the intersection of machine learning engineering, data platform development, and business intelligence, with responsibility for designing and operating the infrastructure that powers AI‑driven insights across the organization.

You will build intelligent data pipelines, production‑grade ML systems, and AI‑enabled features that translate complex data into actionable outcomes. This role is ideal for an engineer who enjoys working end‑to‑end from data ingestion and feature engineering to model deployment and downstream consumption in analytics and BI tools.

  • Applicants must be authorized to work for ANY employer in the U.S. We are unable to sponsor or take over sponsorship of an employment visa at this time.

Responsibilities

  • Build AI‑first data pipelines: Design, implement, and maintain scalable data pipelines that support model training, inference, and analytics use cases across the AI Data Platform.
  • Deploy production ML systems: Develop, deploy, and monitor machine learning models using AWS SageMaker, ensuring reliability, observability, and performance in production environments.
  • Implement Retrieval‑Augmented Generation (RAG): Architect and maintain RAG‑based systems that combine structured and unstructured data to power AI‑driven insights and applications.
  • Operationalize ML lifecycle management: Use MLflow for experiment tracking, model versioning, and lifecycle management to support reproducibility and continuous improvement.
  • Design feature infrastructure: Build and manage feature stores (e.g., Feast, Tecton, or SageMaker Feature Store) to ensure consistent, reusable features across training and inference.
  • Orchestrate complex workflows: Create and manage Apache Airflow DAGs to orchestrate data transformations, model pipelines, and AI workflows with clear dependencies and monitoring.
  • Enable analytics consumption: Partner with BI and analytics teams to ensure ML outputs integrate cleanly with our internal BI reporting hub.
  • Translate business questions into AI solutions: Collaborate with stakeholders to convert ambiguous business problems into measurable ML‑ and data‑driven solutions.
  • Uphold data quality and governance: Ensure AI pipelines and models adhere to data governance, security, and quality standards, particularly when handling sensitive data.
  • Collaborate cross‑functionally: Work closely with Data Science, Analytics Engineering, Medical Economics, and DataOps to align AI platform capabilities with business priorities.
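To illustrate the Retrieval‑Augmented Generation responsibility above, here is a minimal, dependency‑free sketch of the retrieve‑then‑augment pattern. It is illustrative only: the function names are hypothetical, and a production system on this platform would use vector embeddings and a vector store rather than the keyword‑overlap scorer shown here.

```python
def retrieve(query: str, documents: list[str], k: int = 2) -> list[str]:
    """Return the k documents sharing the most words with the query.

    Stand-in for semantic retrieval: real RAG systems score documents
    by embedding similarity against a vector index.
    """
    q_words = set(query.lower().split())
    scored = sorted(
        documents,
        key=lambda d: len(q_words & set(d.lower().split())),
        reverse=True,
    )
    return scored[:k]


def build_prompt(query: str, documents: list[str]) -> str:
    """Augment the user's question with retrieved context.

    The resulting prompt is what would be sent to a language model,
    grounding its answer in the organization's own data.
    """
    context = "\n".join(retrieve(query, documents))
    return f"Context:\n{context}\n\nQuestion: {query}"
```

The two-step shape — retrieve relevant records, then prepend them to the model prompt — is the core of any RAG system, regardless of which retriever or model is swapped in.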

Skills

AWS SageMaker · Apache Airflow · Feast · MLflow · RAG · Tecton
