Your Mission 
 At Ratepay, we are dedicated to revolutionizing the world of digital payments. Established in 2009, we have grown to become Europe's leading white-label Buy Now Pay Later payment provider, processing over 2.5 million transactions per month. Our diverse team of 200+ professionals collaborates to create customized, state-of-the-art payment solutions for major online retailers and marketplaces.
 We are building a future-ready, domain-driven data warehousing platform that empowers high-impact business domains—Risk, Finance, Commercial, Product, and Compliance—to work with trusted, well-modeled data. Our Data team owns the end-to-end data architecture across ingestion, modeling, quality, and enablement.
 As a Senior Data Engineer at Ratepay, you will play a pivotal role in building and optimizing our next-generation cloud-native data platform. You will design and maintain scalable, reliable, and governed data pipelines that power real-time financial services, advanced analytics, and regulatory reporting. You will work closely with BIA, Cloud Infrastructure, Finance, Risk, Compliance, and Domain Engineering teams to shape the next stage of our cloud-native data strategy.
 Key Responsibilities 
  - Implement and drive the technical architecture of our cloud-native data platform, spanning AWS (S3, IAM, Glue, EC2), Snowflake, Kafka, SAP BW, and dbt.
  - Design and optimize ETL/ELT pipelines for structured and unstructured data from SAP BW, SAP FI, Navision, Salesforce, and Kafka streaming sources.
  - Ensure data governance, lineage, and FinOps best practices, enabling secure and cost-efficient cloud operations.
  - Define best practices in cost monitoring, FinOps, and CI/CD for data to ensure stability and performance at scale.
  - Collaborate with analysts and data scientists to deliver self-service BI, advanced analytics capabilities, and ML pipelines.
  - Engage with regulatory and audit needs, enabling data lineage, encryption, and Role-Based Access Control (RBAC) in line with BaFin and GDPR expectations.
   
 That's What You Bring Along
  - 5+ years of experience in data engineering, data warehousing, or cloud data architecture.
  - Expertise in Snowflake, AWS S3, dbt, Kafka, and SQL-based development.
  - Extensive experience designing and scaling data warehousing platforms, including infrastructure-as-code (Terraform), orchestration (Airflow), and CI/CD tooling.
  - Python proficiency for writing clean, modular, and maintainable code.
  - Proven ability to build modular, reusable pipelines with clear separation of concerns across Raw, Data Lake, and Data Mart layers.
  - Knowledge of cloud architecture patterns and data lineage, and experience with governance in regulated industries.
  - Strong problem-solving skills and a track record of optimizing performance and cost efficiency in cloud environments.
  - Bonus: experience with FinTech, payment processing, or regulatory reporting.
   
 Why Join Us:  
  - Be part of mission-critical projects (e.g., refinancing partnerships, Anti-Money Laundering) that impact our financial and risk integrity.
  - Take ownership of a platform used across departments, with end-to-end visibility from source to decision.
  - Work with experienced engineers and stakeholders on cloud-native architecture, with autonomy and purpose.
  - A culture of technical ownership, business enablement, and low operational drag.
   
 That's How Working With Us Will Be
 Equal Opportunities & Diversity 
 We value our diversity and welcome everyone to our team, regardless of ethnic and social background, religious beliefs, worldview, gender, sexual orientation, physical and mental limitations, age, marital status, educational background, or nationality. With over 200 employees and 40 different nationalities, we take our values seriously. These include ownership, growth, integrity, collaboration, customer centricity, and inclusion.