
Junior Data Engineer

Rest Trail

Fresnes · On-site · Contract · Entry Level · Posted today

About the role

The Junior Data Engineer supports the design, development, and maintenance of data pipelines and infrastructure that enable efficient data collection, processing, and analysis. This role focuses on ensuring data reliability, accessibility, and quality to support business intelligence and advanced analytics initiatives.

Responsibilities

  • Assisting in building and optimizing ETL/ELT pipelines, integrating data from multiple sources, and maintaining databases and data warehouses.
  • Collaborating with data analysts, data scientists, and other stakeholders to understand data requirements and ensure seamless data flow across systems.
  • Monitoring data performance, troubleshooting issues, and implementing improvements.
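To illustrate the kind of work described above, here is a minimal extract-transform-load sketch using only the Python standard library. The source rows, table name, and column names are hypothetical placeholders, not part of this role's actual stack.

```python
# Minimal ETL sketch: extract raw rows, validate/cast them, load into a table.
import sqlite3

def extract():
    # Extract: in practice this would read from an API, file, or source database.
    # These rows are made-up sample data.
    return [("2024-01-01", "orders", "120"), ("2024-01-02", "orders", "95")]

def transform(rows):
    # Transform: cast the count column to int and drop rows that fail validation.
    out = []
    for day, metric, count in rows:
        try:
            out.append((day, metric, int(count)))
        except ValueError:
            continue  # skip malformed rows
    return out

def load(rows, conn):
    # Load: write the cleaned rows into a warehouse-style table.
    conn.execute(
        "CREATE TABLE IF NOT EXISTS daily_metrics (day TEXT, metric TEXT, value INTEGER)"
    )
    conn.executemany("INSERT INTO daily_metrics VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract()), conn)
total = conn.execute("SELECT SUM(value) FROM daily_metrics").fetchone()[0]
print(total)  # 215
```

Real pipelines add scheduling, incremental loads, and monitoring on top of this extract/transform/load skeleton, typically via an orchestration tool.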

Qualifications

  • Strong foundation in programming and data handling, with familiarity with languages such as Python or SQL.
  • Basic knowledge of database systems, data modeling, and cloud platforms is beneficial.
  • Understanding of data pipeline concepts, APIs, and workflow orchestration tools is considered an advantage.
  • Strong problem‑solving skills, attention to detail, and a willingness to learn new technologies are essential.
  • The ability to work collaboratively in a team environment while managing tasks independently is highly valued.
  • A degree in computer science, information technology, or a related field is preferred.
  • Curiosity, adaptability, and a passion for working with data are key attributes for success in this role.

Skills

APIs · Cloud platforms · Data modeling · Database systems · ETL/ELT · Python · SQL · Workflow orchestration tools
