Senior Data Engineer
Tractive GmbH
Pasching · On-site · Senior · Posted today
About the role
In this role, you will shape Tractive’s analytical data platform to extract insights from location, health, and telemetry data at scale. You will enable data-driven product and business decisions by building scalable data infrastructure and self-service tooling, and you will own data governance, quality, and security to ensure trustworthy data across teams. Working across engineering, you will evolve the data stack on AWS, balancing reliability, cost, and developer productivity. This is a chance to improve animal wellbeing through data, alongside an international, collaborative team.
Responsibilities
- Design and evolve the analytical data platform to derive insights from GPS, health, business, and device telemetry data
- Build and maintain self-service tooling (data catalog, lineage) to improve data discoverability and trust
- Lead data governance and quality initiatives with metadata management and compliance practices
- Own end-to-end data infrastructure on AWS (Lambda, Glue, Airflow, Redshift, dbt); see the orchestration sketch after this list
- Ship reliable, observable data products via CI/CD, automated testing, and coding standards
- Implement monitoring, alerting, and cost-optimization for proactive infrastructure management
- Collaborate with engineering to meet evolving data platform needs and workflows
- Contribute fresh ideas to continuous improvement and avoid “we’ve always done it this way”
- Grow professionally through workshops and by taking ownership of areas where you see potential
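To give a flavor of the orchestration work described above, here is a minimal Airflow DAG that triggers a dbt build after an extraction step. The DAG id, schedule, commands, and paths are invented for illustration and are not Tractive’s actual pipeline:

```python
# Illustrative sketch only: DAG id, schedule, commands, and paths are assumptions.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="telemetry_daily",          # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                 # Airflow 2.4+ parameter name
    catchup=False,
) as dag:
    # Land raw device telemetry into the lake (placeholder command).
    extract = BashOperator(
        task_id="extract_telemetry",
        bash_command="python extract_telemetry.py",  # hypothetical script
    )

    # Rebuild the analytical models (e.g., in Redshift) via dbt.
    dbt_build = BashOperator(
        task_id="dbt_build",
        bash_command="dbt build --project-dir /opt/dbt/analytics",  # assumed path
    )

    extract >> dbt_build
```

In practice the extraction step might be a Glue job or a Lambda trigger rather than a BashOperator; the operator is used here only to keep the sketch self-contained.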
Requirements
- Proven experience building and operating data platforms with pipelines, storage, and orchestration at scale
- Strong software engineering fundamentals: clean code, automated testing, Git workflows, CI/CD (e.g., GitHub Actions, Jenkins)
- Solid understanding of networking and cloud security (subnets, routing, IAM, key/secret management) for building secure cloud infrastructure
- Experience in data quality and metadata management using OpenMetadata and AWS Glue; governance and security best practices
- Proficiency in Python with PySpark for large-scale processing; see the sketch after this list
- Hands-on experience with cloud data analytics stacks (AWS, Azure, GCP) and Infrastructure as Code (CloudFormation, CDK)
- Very good English skills
- Valid Austrian work permit
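As a small illustration of the PySpark proficiency asked for above, the snippet below aggregates raw telemetry into a per-device daily mart. The S3 paths, schema, and column names are hypothetical:

```python
# Minimal PySpark sketch; input path and schema are invented for illustration.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("telemetry-daily-agg").getOrCreate()

# Hypothetical Parquet dataset of raw GPS/health telemetry events.
telemetry = spark.read.parquet("s3://example-bucket/telemetry/")

# Aggregate per device and day: event counts and average battery level.
daily = (
    telemetry
    .withColumn("day", F.to_date("event_ts"))
    .groupBy("device_id", "day")
    .agg(
        F.count("*").alias("events"),
        F.avg("battery_pct").alias("avg_battery_pct"),
    )
)

# Write a partitioned mart for downstream analytics.
daily.write.mode("overwrite").partitionBy("day").parquet(
    "s3://example-bucket/marts/device_daily/"
)
```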
Key requirements
Skills
Airflow, AWS, AWS CloudFormation, AWS Glue, AWS IAM, AWS Lambda, AWS Redshift, Azure, CDK, CI/CD, dbt, GCP, GitHub Actions, Git, Jenkins, OpenMetadata, Python, PySpark
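As one concrete, entirely hypothetical flavor of the CDK and CloudFormation skills listed above, a minimal AWS CDK (v2) app in Python might provision a raw-data bucket and a Glue catalog database like this; all resource names are invented:

```python
# Hypothetical AWS CDK v2 sketch; stack, bucket, and database names are invented.
import aws_cdk as cdk
from aws_cdk import aws_glue as glue
from aws_cdk import aws_s3 as s3


class DataLakeStack(cdk.Stack):
    def __init__(self, scope, construct_id, **kwargs):
        super().__init__(scope, construct_id, **kwargs)

        # Versioned raw-zone bucket for incoming telemetry.
        s3.Bucket(self, "RawTelemetryBucket", versioned=True)

        # Glue catalog database for analytical tables.
        glue.CfnDatabase(
            self,
            "AnalyticsDatabase",
            catalog_id=self.account,
            database_input=glue.CfnDatabase.DatabaseInputProperty(
                name="analytics",
            ),
        )


app = cdk.App()
DataLakeStack(app, "DataLakeStack")
app.synth()
```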