
Data Engineer

Myticas Consulting

Crediton · On-site · Contract · Mid Level

About the role

Key Responsibilities

  • Design, build, and deploy end-to-end data pipelines using Azure Synapse Analytics.
  • Integrate data from various third‑party systems, ensuring high‑quality, clean, and consistent data for downstream use.
  • Work with large datasets, performing transformations and applying best practices in data architecture.
  • Ensure data accuracy, integrity, and compliance throughout the pipeline development process.
  • Assess existing data pipelines for performance, accuracy, and scalability.
  • Implement optimizations to enhance efficiency and reduce processing times.

Data Modeling

  • Design and implement optimized data models that support analytics and reporting.
  • Develop both dimensional and normalized models aligned with business objectives.

Integration with Third‑Party Systems

  • Use Azure Synapse to extract data from third‑party platforms (e.g., SaaS applications, CRM, ERP systems).

Data Layer Creation for Reporting

  • Develop and maintain structured data layers that support reporting and business intelligence tools.
  • Ensure data is accurate, accessible, and optimized for analysis.

Collaboration

  • Work closely with data architects, business analysts, and stakeholders to define business requirements and translate them into technical solutions.
  • Collaborate with cross‑functional teams to build efficient and reliable data flows.

Monitoring & Troubleshooting

  • Continuously monitor data pipelines for performance and reliability.
  • Troubleshoot discrepancies and resolve issues in a timely manner.

Key Skills and Qualifications

  • Azure Synapse Analytics expertise – strong knowledge of both data lake and data warehouse features.
  • ETL/ELT development – hands‑on experience creating and managing complex data pipelines within Azure Synapse.
  • Data modeling – ability to design and implement both normalized and dimensional data models.
  • Azure Data Factory – experience using Azure Data Factory or similar tools for ETL workflow orchestration.
  • Third‑party system integration – experience integrating data from external applications using APIs and connectors.
  • SQL proficiency – strong SQL skills for data transformation, querying, and optimization.
  • Programming skills – proficiency in languages such as Python, Scala, or PowerShell.
  • Problem‑solving – strong troubleshooting and performance optimization skills.
  • Collaboration – ability to work with cross‑functional teams and align technical solutions with business goals.

Preferred Qualifications

  • Experience with additional Azure services (Azure Data Lake Storage, Azure Blob Storage, Azure SQL Database).
  • Familiarity with reporting tools such as Power BI or Tableau.
  • Understanding of cloud‑based data architectures, including event‑driven and serverless computing.

Education and Experience

  • Bachelor’s degree in Computer Science, Information Technology, Engineering, or a related field (or equivalent experience).
  • 3–5 years of hands‑on experience in a data engineering role, with a focus on Azure Synapse or similar data platforms.

Additional Information

  • Seniority level: Mid‑Senior level
  • Employment type: Contract
  • Job function: Information Technology and Engineering
  • Industries: Technology, Information and Media, Software Development

Skills

Azure Data Factory · Azure Synapse Analytics · Data Lake Storage · PowerShell · Python · Scala · SQL
