Data Platform Engineer
Tractive
We’ve introduced a 3-day weekend for all full-time employees, at full salary. From June 2022, we’ll work 35-hour, 4-day weeks.
Only considering candidates eligible to work in Linz, Austria ⚠️
About us
Tractive is the most trusted name for keeping cats and dogs safe and healthy. The secret to our success? A team of truly unique individuals who care about each other just as much as going the extra mile for pet parents. Want to be part of our story? Check out this role and see if you’re a good match!
Your territory
As part of our Data team, you will:
- Enable data teams across Tractive by pioneering a Data-Platform-as-a-Service strategy. You’ll build standardized workflows that allow Data Engineers and Analysts to deliver trustworthy insights quickly—empowering data-driven decisions across the company
- Build and maintain self-service tooling such as data catalogs, lineage tracking, and dashboards for cost and access. You’ll guide the selection and implementation of solutions that improve data discoverability, trust, and cost transparency
- Strengthen and scale our data platform by designing and extending resilient, secure, scalable, cloud-native Data Lakehouse and Warehouse solutions on AWS
- Apply networking architecture and security best practices, including VPC design, IAM, encryption, and data classification
- Ship reliable and observable data products through established coding standards, automated testing, and CI/CD pipelines that make data changes fast, safe, and developer-friendly
- Implement monitoring, alerting, and cost-optimization strategies so we can confidently and proactively manage our infrastructure
- Partner across engineering teams to ensure the data platform continues to meet evolving needs, performance expectations, and development workflows
- Collaborate closely with backend, firmware, and mobile teams to build and operate a unified data platform used throughout Tractive
- Bring your fresh ideas to make Tractive better - you’ll never hear the phrase “...because that’s how we’ve always done things”
- Continuously grow personally and professionally, take ownership of areas where you can show your potential, and attend workshops that help you get to the next level
Your profile
Key requirements:
- Proven experience in architecting and delivering data platforms, including pipelines, storage formats, and compute orchestration
- Strong software engineering fundamentals – clean code, automated testing, Git workflows, CI/CD (e.g. GitHub Actions, Jenkins)
- Solid understanding of networking and cloud security, including subnets, routing, IAM, key and secret management
- Proficient in Python, with hands-on experience using PySpark for large-scale data processing
- Experienced with Infrastructure as Code (IaC) tools such as AWS CloudFormation, CDK, or Terraform
- Hands-on experience with cloud data analytics stacks (AWS, Azure, GCP), covering storage, processing, and analytics services
- Familiarity with Open Table Formats (e.g. Iceberg, Delta Lake, Hudi) and Data Lakehouse principles
- Very good English skills
Does this sound like you?
- Passionate about data and its potential to drive impact
- Meticulous about quality, security, and cost efficiency
- A relentless problem solver who thrives on autonomy and clear ownership
- Excited to work in Austria and collaborate face-to-face with an outstanding team