
Data Engineer

APEC - Offres Gratuites

Pusignan · Hybrid · Full-time · Mid Level · €48k–€50k/yr

About the role

Externatic is recruiting on behalf of an international group specializing in national and international freight transport, with more than 80 sites in France and a presence in 160 countries. With revenue close to one billion euros and growth driven by strategic acquisitions, the group is pursuing an ambitious development trajectory.

At the heart of this growth, the Data Department plays a key role in structuring data and deriving value from it to support Supply Chain and logistics activities. Reporting to the Chief Data Officer, you will join a multidisciplinary team of Data Engineers, Data Analysts, and Data Stewards.

Your mission is to design and evolve robust data pipelines and strategic indicators (descriptive and predictive) to optimize the group's operational and decision-making performance.

Within a structured and industrialized AWS environment, you will be involved in the entire data lifecycle:

Responsibilities

  • Deploy and maintain batch data pipelines, with a planned evolution toward streaming flows, ensuring performance, reliability, and traceability.
  • Implement DataOps / DevOps practices: version control, CI/CD, and multi-environment automation via Infrastructure as Code (CloudFormation, CDK, Terraform, or equivalent).
  • Structure and optimize the Data Lake / Data Warehouse: modeling, partitioning, format management, data cataloging, and governance.
  • Industrialize the processes for ingesting, transforming, and securely storing sensitive data, in compliance with quality and conformity requirements.
  • Build and maintain datasets for Business Intelligence tools (Qlik Sense) and contribute to defining business KPIs (historical data, near real-time).
  • Support operational teams in their analyses: data exploration, understanding Supply Chain processes, and designing high-value-added indicators.
  • Depending on your interest, participate in generative AI projects on AWS (LLM, RAG, intelligent agents) to enhance analytical and decision-making support capabilities.
  • Contribute to the dissemination of Data & BI best practices: documentation, standardization, experience sharing, and team upskilling.

Qualifications

  • Higher education in IT or data (engineering school, master's degree, or equivalent).
  • Significant experience in Data Engineering within a cloud environment, ideally AWS.
  • Solid proficiency in Python / PySpark.
  • Good knowledge of Data Lake / Data Warehouse architectures.
  • Experience with AWS environments (or equivalent cloud).
  • Hands-on experience with CI/CD tools and version control.
  • Experience in Infrastructure as Code (CloudFormation, CDK, Terraform, or comparable technologies).
  • Awareness of data performance, quality, and security challenges.
  • Experience in feeding decision-making tools, particularly Qlik Sense (or similar BI solution).
  • Experience with Talend: loading databases, building web services/APIs, and consuming semi-structured external sources.
  • Ability to communicate with business stakeholders and translate operational needs into relevant indicators.

A first exposure to generative AI projects (LLM, RAG, agents) is a plus, but not essential.

Working Conditions

  • Salary: €48,000 to €50,000 / year (fixed), depending on profile
  • Remote: 2 days / week
  • Location: Lyon-Pusignan

Benefits

  • Stable Group
  • RTT (Reduction of Working Time)
  • Attractive Health Insurance
  • Transport allowance 50% covered

Recruitment Process

  • Application processing and meeting with Audrey THIERY
  • 1 meeting with the Chief Data Officer
  • 1 meeting with the manager's manager (N+2)

Skills

AWS · CDK · CloudFormation · Data Lake · Data Warehouse · DevOps · Infrastructure as Code · LLM · Python · PySpark · Qlik Sense · RAG · Talend · Terraform
