Cheil Germany GmbH

Data Engineer (f/m/d)


Job description:

We are looking for a skilled Data Engineer to join our team. The successful candidate will be responsible for designing, developing, and maintaining scalable data pipelines using Apache Airflow or similar orchestration tools.

Responsibilities

  • Design, develop, and maintain scalable data pipelines, using Apache Airflow or similar orchestration tools to build and schedule workflows.
  • Architect and implement efficient data models (star schema, snowflake schema, normalized/denormalized structures) in Snowflake to support business intelligence and analytics needs.
  • Collaborate with analytics and data science teams to translate business requirements into data models and ETL processes.
  • Optimize Snowflake data warehouse for performance, cost, and scalability.
  • Build and maintain CI/CD workflows using Bitbucket Pipelines for automated deployment and testing of data pipelines and models.
  • Leverage AWS services (S3, Lambda, ECS) to support data ingestion, transformation, and storage.
  • Implement data quality frameworks, perform data validation, monitoring, and anomaly detection to ensure data accuracy and consistency.
  • Monitor data pipeline health and troubleshoot issues to ensure reliability and accuracy.
  • Ensure adherence to data governance policies and best practices.

Requirements

  • Proven experience as a Data Engineer or similar role, with a strong emphasis on data modeling and warehousing.
  • Expertise in designing and implementing data models tailored for analytical workloads in Snowflake.
  • Hands-on experience with Apache Airflow or similar tools for building and scheduling data workflows.
  • Proficiency with AWS services relevant to data engineering (S3, Lambda, EC2).
  • Experience with Bitbucket Pipelines or comparable CI/CD tools.
  • Advanced SQL skills and experience in query optimization.
  • Proficient in Python or other scripting languages for ETL and data processing.
  • Strong understanding of data quality concepts, data validation techniques, and experience implementing data quality checks.
  • Solid understanding of data warehousing concepts, including dimensional modeling, normalization/denormalization, slowly changing dimensions (SCD), and fact/dimension tables.
  • Familiarity with big data concepts and handling large-scale datasets.
  • Strong collaboration and communication skills to work across technical and business teams.

Nice to Have

  • Experience with containerization technologies such as Docker.
  • Familiarity with data governance, security standards, and best practices.

What We Offer

  • 30 days of vacation per year
  • Hybrid work model with flexible working hours
  • Opportunities for professional growth and development
  • Attractive salary and benefits package
  • Chance to work with a global company and international teams

We are an equal opportunities employer and welcome applications from all qualified candidates. We are committed to creating a diverse and inclusive workplace and encourage applications from underrepresented groups.

NOTE: Please mention Fuchsjobs as the source of your application.

Job details

  • Type: Full-time
  • Work model: On-site
  • Category:
  • Experience: 2+ years
  • Employment: Salaried
  • Published: 04 Nov 2025
  • Location: Work From Home
