
Senior Data and Analytics Engineering Manager

Shout! Factory

Los Angeles · Hybrid · Full-time · Senior · $150k – $175k/yr · 6d ago

About the role

Position Overview

Radial Entertainment, the largest independent, ad-supported streaming network and film/TV distributor worldwide, is seeking a skilled and ambitious Sr. Data and Analytics Engineering Manager. This role is pivotal in expanding our modern, extensible platform of data pipelines, orchestrations, and analytics infrastructure. You will blend technical leadership with hands‑on development—driving scalable data models, high‑quality transformations, and observability tooling across our cloud‑native stack while collaborating closely with BI, data science, and business teams.

Reporting to the Director of Data Product, you’ll integrate state‑of‑the‑art AI developer tools and best practices into our workflows—ensuring the team builds smarter and faster.

How to Apply: Email a cover letter and resume to jobs@radialentertainment.com.

Key Responsibilities

  • Manage and grow our data/analytics engineering function as both a day‑to‑day technical lead and a people manager.
  • Collaborate with diverse teams to onboard data and support an expanding suite of reporting, advanced analytics, and operational capabilities.
  • Evolve the Data Warehouse schema in line with best practices for analytic and operational use‑cases.
  • Maintain and expand core ELT orchestrations with a focus on enhancing data observability and controls throughout our modern data stack.
  • Implement automated tests and QA processes to ensure quality and reliability of all contributions.
  • Collaborate with data science stakeholders to ensure robust feature engineering and selection for machine learning methodologies.
  • Support the broader development team by championing best practices for efficient DevOps.
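As a purely illustrative sketch of the kind of automated data-quality gate the testing and observability responsibilities above describe (not part of the posting: the `plays` table, column names, and thresholds are hypothetical, and sqlite3 stands in for the warehouse):

```python
import sqlite3

def check_table_quality(conn, table, not_null_col, min_rows=1, max_null_rate=0.0):
    """Return (ok, stats) for a simple row-count / null-rate quality gate."""
    total, nulls = conn.execute(
        f"SELECT COUNT(*), SUM(CASE WHEN {not_null_col} IS NULL THEN 1 ELSE 0 END) "
        f"FROM {table}"
    ).fetchone()
    null_rate = (nulls / total) if total else 1.0
    ok = total >= min_rows and null_rate <= max_null_rate
    return ok, {"rows": total, "null_rate": null_rate}

# Demo with an in-memory database standing in for the warehouse.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE plays (user_id TEXT, title TEXT)")
conn.executemany("INSERT INTO plays VALUES (?, ?)",
                 [("u1", "Film A"), ("u2", "Film B"), ("u3", None)])
ok, stats = check_table_quality(conn, "plays", "title",
                                min_rows=1, max_null_rate=0.5)
```

In practice a check like this would run inside the orchestration layer (e.g. as a dbt test or a pipeline task) and fail the run when a threshold is breached.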

Required Qualifications

  • 4‑7 years of relevant experience in data‑intensive applications, particularly with high‑volume streaming data, audience analytics, and ads data.
  • 3+ years as an individual contributor managing batch and streaming pipelines and orchestrations in data warehousing or data science settings.
  • Proficiency in a general‑purpose programming language (preferably Python), along with Bash scripting and infrastructure configuration.
  • Strong experience with Cloud Data tooling (AWS preferred) and containerized workflows (ECS, EKS).
  • Solid foundations in data modeling techniques (dimensional, object, app access patterns) and transformations (DBT preferred).
  • Expert SQL skills for writing and tuning complex analytic queries against MPP databases (Snowflake).
  • Familiarity with automated testing frameworks, CI/CD processes, and agile development practices.
  • Ability to balance technical leadership with hands‑on feature development, and to translate strategic directives into clear specifications.
  • Strong documentation skills (ERDs, UML, docstrings) and effective communication with technical and non‑technical stakeholders.
  • Experience with AI‑augmented development workflows and generative AI infrastructure.

Preferred Qualifications

  • Experience with serverless tooling such as AWS Lambda, Fargate, Glue (PySpark), Serverless Framework, and AWS Chalice.
  • Proven ability deploying and orchestrating containers using Docker/Kubernetes and AWS ECS/Fargate.
  • Hands‑on knowledge of CI/CD tools (GitLab, GitHub Actions, Jenkins) and Infrastructure as Code (Terraform).
  • Familiarity with orchestration tools like Prefect, Airflow, or similar.
  • Deep experience with DBT and big data ecosystems (PySpark, HDFS, columnar file formats such as Parquet and ORC).
  • Proficiency with AI developer tools (Cursor, Claude Code, GitHub Copilot).
  • Experience with machine learning frameworks (TensorFlow, Keras, Scikit‑Learn, SageMaker) and web app development (Streamlit, Flask, React, Django).
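For a sense of the dimensional modeling and analytic SQL called out in the qualifications, a minimal star-schema rollup might look like this (illustrative only, not part of the posting: the table names are hypothetical and sqlite3 stands in for an MPP warehouse such as Snowflake):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Tiny star schema: one fact table keyed to a date dimension.
conn.executescript("""
CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, month TEXT);
CREATE TABLE fact_plays (date_key INTEGER, minutes_watched REAL);
INSERT INTO dim_date VALUES (20240101, '2024-01'), (20240201, '2024-02');
INSERT INTO fact_plays VALUES (20240101, 30.0), (20240101, 45.0), (20240201, 60.0);
""")
# Typical analytic rollup: minutes watched per month.
rows = conn.execute("""
    SELECT d.month, SUM(f.minutes_watched) AS total_minutes
    FROM fact_plays f
    JOIN dim_date d ON d.date_key = f.date_key
    GROUP BY d.month
    ORDER BY d.month
""").fetchall()
```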

Benefits & Perks

  • Competitive Compensation: $150k – $175k DOE
  • Full Benefits Package: 401(k) with match, medical, dental, vision, and more
  • Paid Time Off: Holidays, vacation, paid sick leave, personal days
  • Work Location: Remote (must reside in Los Angeles or New York), with a hybrid option for LA-based candidates


Skills

AWS, AWS Chalice, AWS CloudFormation, AWS Glue, AWS Lambda, AWS SageMaker, Bash, CI/CD, Docker, Django, DBT, ECS, EKS, Flask, GitLab, GitHub Actions, HDFS, Jenkins, Kubernetes, MPP databases, Parquet, Prefect, Python, React, SQL, Scikit-Learn, Serverless Framework, Snowflake, Streamlit, Terraform, TensorFlow, UML, Vector Databases, Web app development
