Advanced Data Engineer

Zuehlke Engineering Vietnam LLC.

Flexible · Full-time · Senior

About the role

About Zühlke

Founded in Switzerland in 1968, Zühlke is owned by its partners and located across Europe and Asia. We are a global transformation partner, with engineering and innovation in our DNA. We help clients envision and build their businesses for the future – to run smarter today while adapting for tomorrow’s markets, customers, and communities. Our multidisciplinary teams specialise in tech strategy and business innovation, digital solutions and applications, and device and systems engineering. We excel in complex, regulated spaces including health and finance, connecting strategy, tech implementation, and operational services to help clients become more effective, resilient businesses.

Role

As an Advanced Data Engineer, you will design, implement, and maintain data pipelines while working closely with clients to understand their data needs and deliver tailored solutions that drive business insights.

Responsibilities

  • Act as a trusted advisor, guiding customers toward successful technical solutions for their data challenges.
  • Communicate the “what”, “why”, and “how” of proposed solutions to technical and non‑technical stakeholders.
  • Develop, test, and monitor distributed data processing pipelines.
  • Integrate varied data sets and sources to produce high‑quality, reproducible datasets in a scalable and maintainable way.
  • Collaborate with other data roles such as Architects, Software Engineers, and Data Scientists.
  • Understand the needs of many types of producers and consumers for our data services, ensuring our products meet their requirements.
  • Deliver projects in an Agile way, building iteratively to produce value from data early and frequently.
  • Keep yourself technically sharp, staying open to learning new concepts and technologies.

Requirements

  • University degree (ETH, Uni, FH) in computer science, software engineering, data science, or a comparable education.
  • At least 3 years in data or software engineering positions.
  • Experience designing, building, and maintaining data products that meet the needs of data consumers.
  • Understanding of common approaches to data analysis and data visualization; data science knowledge is a plus.
  • Experience with a variety of data architectures (e.g., Data Lake, Data Lakehouse, Medallion, Data Products, streaming, batch processing).
  • Experience with Cloud Data Platforms such as Databricks, Snowflake, Microsoft Fabric, or Amazon SageMaker.
  • Practical data programming skills in Python and SQL.
  • Hands‑on skills or strong interest in technologies such as Apache Spark, Delta Lake, Airflow, Kafka, Kubernetes, Terraform, FastAPI, and programming languages (Scala, R, Java, TypeScript, .NET).
  • Hands‑on experience with both relational and non‑relational databases.
  • Familiarity with big data infrastructures and concepts for storing and processing large and/or heterogeneous data volumes.
  • Practical knowledge of handling varied types of structured and unstructured data (text, tabular, graph, time‑series, geospatial, image, etc.).
  • Experience with Agile development and DevOps methodologies.
  • Fluency in both German and English.

Benefits

  • Work‑life blend: safe & healthy workplace, flexible working hours, and the possibility to work from home.
  • Profit‑share scheme: additional profit share based on the company’s success in the previous year.
  • Global and diverse community: collaboration across 16 offices worldwide, inclusive culture, annual team camps, year‑end parties, and local festivities.
  • Committed to development: investment in personal and professional growth, empowering you to build the skills needed to make a positive impact for yourself and our clients.

We welcome people from all backgrounds, regardless of gender, personality, national origin, race, religion, colour, sexual orientation, gender identity, age, marital status, disability, or veteran status.

Skills

Airflow · Amazon SageMaker · Apache Spark · Databricks · Delta Lake · DevOps · FastAPI · Java · Kafka · Kubernetes · Microsoft Fabric · Python · R · Scala · Snowflake · SQL · Terraform · TypeScript · .NET
