At Omio, our data platform is the nerve centre of the entire organisation: it’s how we understand and steer our business, and it mediates how departments communicate with each other. With massive event volumes, over 1,000 travel providers and a host of other third parties, there is real richness and variety in the data we work with; we consolidate, aggregate and standardise it for fast delivery to our teams, our customers and our partners.
Please ONLY APPLY if you can work in our Berlin office 2-3 days a week and are currently located in Germany or elsewhere in Europe.
You will be responsible for designing, building and maintaining our pipelines and data warehouses/data lakes; tracking data quality and health; ensuring appropriate availability and latency for our data products; and owning the backbone infrastructure of our semantic and data governance layers.
This is a hands-on technical role, but you will also lead a small team of two (an engineer and an architect, both in Berlin) and work closely with the VP of Data and Analytics to deliver on the roadmap for our data strategy.
You can expect to spend roughly 50% of your time on hands-on technical work, 30% on requirements gathering and 20% on managerial duties.
The ideal candidate for this role will be someone who has:
- Hands-on experience with our GCP-based stack: BigQuery, Looker, Airflow, dbt, Docker, SQL, Python and Terraform.
- Experience in large-scale data processing with Apache Spark or similar technologies, and proficiency in Scala, Java, or Python.
Type: Full-time
Work model: Hybrid
Category: Development & IT
Experience: Lead
Employment: Salaried employee
Published: 22 Oct 2025
Location: Berlin