Technical Product Manager - Data Engineering F/M (Hybrid)
Betclic Group
About the role
ENTER THE GAME 🎮
As our first Technical Product Manager for Data Engineering, you will own the product vision and roadmaps for two transverse teams at the heart of Betclic's data transformation:
- Data Platform Engineering — DataOps, DevOps Data, infrastructure-as-code for the data stack, CI/CD pipelines, observability, platform reliability, and product analytics data pipelines (Amplitude).
- Data Core — the engineering centre of excellence embedded across domain squads: technical mentoring, cross‑cutting standards, tooling, monitoring, and technology watch.
This is a greenfield role: you'll be defining what great product management looks like for these teams.
YOUR ROLE WITHIN BETCLIC 🔥
Owning and driving the product roadmap
Define, prioritize, and deliver clear roadmaps for Data Platform Engineering and Data Core. You will translate technical challenges, operational needs, and strategic initiatives into actionable plans, balancing short‑term priorities with long‑term platform vision. You will run regular syncs with domain squad Product Owners and Engineering Managers to surface dependencies, unblock delivery, and ensure alignment across the organization.
Leading the Data Platform Engineering product
Drive the evolution of our data infrastructure and tooling — DataOps, CI/CD, data quality, environment management, and cost governance. You will own the pipelines feeding product analytics tools (Amplitude), ensuring reliable and well‑documented event flows. You will collaborate with FinOps initiatives to optimize cloud costs across the full data stack: Snowflake, AWS, and emerging tools such as ClickHouse and Astro.
Driving the Data Core mission
Ensure the Data Core team stays 100% focused on its transverse mission: technical support to domain squads, continuous improvement of data flows (drift detection, root‑cause analysis), technology watch and POCs, definition and dissemination of engineering guidelines, developer tooling and DX improvements, and robust monitoring and observability across the platform.
Contributing to Betclic's strategic transformation (Tech 2026)
Actively support key initiatives such as Data Mesh adoption, FinOps (cloud cost control across Snowflake, AWS, ClickHouse, Astro…), AI‑driven data engineering (LangGraph, Bedrock‑powered automation agents), and data governance (Snowflake access control, security posture, programmatic authentication).
Facilitating alignment and scaling collaboration
Lead product rituals (roadmap reviews, planning sessions, OKRs), ensure clear and impactful communication, and represent Data Platform & Core in cross‑functional discussions. As the organization grows, you will help structure and scale the product scope — with a natural split into dedicated PMs per team anticipated.
WHO ARE WE LOOKING FOR? 🔍
We're looking for a rare blend: a product‑minded engineer or engineering‑minded PM who thrives at the intersection of platform thinking and data engineering.
This role is for you if:
- You have 5+ years of experience, including significant time as a PM, PO, or tech lead in a data or platform engineering context
- You're comfortable reading and reasoning about modern data architectures — data lakehouse, streaming pipelines, data mesh, orchestration — even if you're not hands‑on with every tool
- You have worked closely with or within data engineering teams (Snowflake, dbt, Spark/Flink, Airflow, or equivalents)
- You know how to structure a technical backlog, run discovery with engineers, and turn complex infrastructure needs into clear outcomes
- You're a strong communicator who can talk architecture with Staff Engineers and roadmap priorities with senior leadership
- You bring rigor, autonomy, and a continuous improvement mindset — you measure impact, not just delivery
- You speak English fluently (our working language in tech)
Bonus points:
- Experience in a data mesh or platform‑as‑a‑product transformation
- Familiarity with FinOps principles applied to cloud data platforms
- Exposure to AI/ML engineering or agentic systems
WHAT ARE THE RECRUITMENT STEPS? 📝
If shortlisted, you'll be contacted within one week for an initial HR screening by Joe (30 min), followed by an AssessFirst assessment (personality, motivation, cognitive reasoning).
The process then unfolds in four steps:
- Interview with the Director of Data Engineering — scope deep‑dive, vision alignment, and mutual fit
- Interview with the Head of Engineering, Data Platform — technical context, team dynamics, ways of working
- Case study presentation — you'll be given a realistic backlog of Data Platform and Data Core items (FinOps, monitoring, data mesh enablement, developer tooling, governance…) and asked to build and present a prioritized 2‑quarter roadmap, including your methodology, trade‑offs, and how you'd communicate it to engineering leads and domain squads
- Final HR interview — review of AssessFirst results with the Talent team
The full process typically takes 4 to 6 weeks.
WHAT CAN YOU EXPECT? 🎁
- ✅ 25 days of paid leave and 10 RTT days
- 🍽️ A Ticket Restaurant® card credited with €11 per day (€6 per day funded by Betclic)
- 🩺 100% health insurance coverage for you and your children
- 🚆 50 % reimbursement of public transport costs or an annual sustainable mobility allowance (€230 for commuting with sustainable transport)
- 🏡 Hybrid work model
- 📚 Access to a vast training catalog, with opportunities for professional development every year
- 🏢 Extraordinary office spaces with a rooftop where you can enjoy sunny breaks with a view of the Cité du Vin
- 🎉 Internal events to liven up your daily life
- 🏋️ On‑site sports classes and organized tournaments (Pilates, circuit training, boxing, yoga, futsal, padel, tennis…)