Data Engineer
Our client is a cutting-edge energy trading startup specializing in quantitative trading strategies across European energy markets.
The quantitative trading team leverages models and data to drive decision-making. They build tools for their own use, which means quality, speed, and precision truly matter. To help accelerate innovation, they’re looking for a Data Engineer with a strong focus on performance, reliability, and clean engineering. Expect deep technical work, minimal meetings, and no external stakeholder juggling: just direct impact.
You enjoy
    - Writing efficient C++ code that delivers low-latency execution and high reliability.
 
    - Organizing and querying large volumes of time-sensitive data using BigQuery.
 
    - Getting the details right, from error handling to naming conventions, because precision matters.
 
    - Collaborating in a team that values humility, kindness, and technical excellence.
 
    - Taking ownership of your work, from design to deployment and beyond.
 
You can support the team with
    - Building and maintaining low-latency, high-availability data pipelines in C++ and Python.
 
    - Engineering performant and scalable data warehousing solutions in BigQuery, using advanced SQL features (UDFs and table-valued functions).
 
    - Deploying production systems using Docker, Git, and CI/CD best practices.
 
    - Leveraging AWS services including S3, ECR, ECS, Fargate, and Lambda to build resilient cloud-native pipelines.
 
    - Designing systems with high conscientiousness: anticipating failure modes, writing clean code, and documenting responsibly.
 
    - Working in a collaborative, agreeable way that supports shared goals and constructive feedback.
 
Bonus if you bring
    - Experience in short-term power markets (Day-Ahead, Intraday auctions and continuous trading, and Imbalance) on exchanges such as EPEX SPOT, EXAA, Nord Pool, and EEX, as well as relevant data like weather, trades, and order book updates.
 
    - Knowledge of infrastructure-as-code and cloud architecture best practices (e.g. Pulumi, Terraform, AWS Well-Architected Framework).
 
    - Familiarity with data pipeline orchestration and monitoring tools (e.g. Airflow, dbt, Grafana) or MLOps tooling.
 
    - Programming experience in Go or Rust.
 
What’s on offer
A deeply technical, fast-moving environment where you work with traders, not for them.
The systems you build go into production immediately, providing a rare and rewarding feedback loop from both peers and the market.
A high-agency team culture that values care, clarity, and cooperation.