
Data Ingenieur H/F

EDF

Palaiseau · On-site · Full-time · 1w ago

About the role

At the intersection of essential challenges, join EDF, an international group committed to the energy transition. Within EDF R&D, at the EDF Lab Paris-Saclay site, our mission is to contribute to the operational performance of the Group's business units. You will join the SEQUOIA department, and more specifically the "Customer Consumption Analysis, Supply Offers" group, composed of about twenty research engineers.

Responsibilities

  • Configure infrastructure appropriately and efficiently for the teams who work with data.
  • Identify constraints, then the technical components to use for collecting, storing, and manipulating data from various internal or external sources.
  • Make data available, organized optimally for the people who will process it: Data Scientists, Data Analysts, and developers.
  • Ensure the operation and monitoring of data flows and applications deployed in production.
  • Contribute to structuring databases (semantics, formats, etc.) in the data lake.
  • Participate, depending on the activity, in proofs of concept (POCs) to test and qualify new solutions proposed by IT services.
  • Propose improvements to the technical foundation (data access, integration, storage, and exploitation) as components become obsolete.
  • Master the AI-related tools and technologies for optimizing data flows, such as real-time data processing systems and the integration of AI algorithms into data pipelines.
  • Automate large-scale data management and work with architectures adapted to increasingly complex Machine Learning and Deep Learning models.
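The collect-transform-store responsibilities above can be sketched as a minimal Extract-Transform-Load step. This is a generic illustration only, not EDF's actual stack; the `consumption` table, its columns, and the sample rows are hypothetical:

```python
import sqlite3

def extract():
    # Extract: in a real pipeline, this would read from an internal
    # or external source (API, files, message queue). Sample data here.
    return [("2024-01-01", "42.5"), ("2024-01-02", "n/a"), ("2024-01-03", "39.0")]

def transform(rows):
    # Transform: validate and convert values; drop unparseable records.
    cleaned = []
    for day, kwh in rows:
        try:
            cleaned.append((day, float(kwh)))
        except ValueError:
            continue  # a production pipeline would log or quarantine bad rows
    return cleaned

def load(rows, conn):
    # Load: store in an analytics-ready table for downstream consumers.
    conn.execute("CREATE TABLE IF NOT EXISTS consumption (day TEXT, kwh REAL)")
    conn.executemany("INSERT INTO consumption VALUES (?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract()), conn)
print(conn.execute("SELECT COUNT(*), SUM(kwh) FROM consumption").fetchone())
# → (2, 81.5)
```

In practice, each stage would be a separate, monitored job orchestrated by an ETL tool, but the extract/transform/load separation is the same.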

Qualifications

  • Master's-level degree (Bac+5: engineering school or a Master's such as MVA) with a specialization in applied mathematics, statistics, artificial intelligence, or data engineering.
  • Technical skills desired:
    • Programming: specifications, design, development, testing
    • Data modeling and quality assurance using monitoring indicators
    • Relational or NoSQL databases
    • Big Data storage and processing
    • Automated data flows using ETL tools (Extract, Transform, Load)
  • Cross-functional skills: Communication, Problem Solving, Project Management

Tools Mastered

  • Data analysis and visualization tools: PowerBI, Dash, Grafana, Kibana
  • Cloud: Amazon AWS, Microsoft Azure, Google Cloud Platform

Experience

A first experience would be appreciated.

What will make the difference

  • Knowledge in data science would be a plus.
  • Experience in Big Data processing.

Skills

AWS · Azure · Big Data · Cloud Platform · Dash · Data Lake · Deep Learning · ETL · Grafana · Google Cloud Platform · AI · Kibana · Machine Learning · NoSQL · PowerBI · SQL
