Seargin

Data Platform DevOps Engineer

Job description:

  • Design, deploy, and maintain scalable cloud infrastructure on AWS to support data processing and analytics workloads.
  • Databricks Platform Operations
    • Configure, manage, and optimize Databricks environments for efficient execution of data pipelines and analytical processes.
  • Build and maintain Jenkins-based pipelines to automate build, test, and deployment processes across data and application services.
  • Automation & Scripting
    • Develop Python and shell scripts to automate repetitive tasks, improve workflows, and enhance operational efficiency.
  • Collaborate with data teams to ensure reliable deployment, execution, and monitoring of data pipelines in production environments.
  • System Monitoring & Alerting
    • Implement monitoring solutions and alerting mechanisms to ensure system health, performance, and availability.
  • Performance Optimization
    • Analyze system performance and optimize infrastructure, pipelines, and workloads for scalability and cost efficiency.
  • Security & Compliance Implementation
    • Apply security best practices and ensure infrastructure and processes comply with organizational and regulatory standards.
  • Maintain consistent configuration across development, testing, and production environments using automated tools.
  • Diagnose and resolve infrastructure, deployment, and runtime issues in distributed cloud-based systems.
  • Cross-Functional Collaboration
    • Work closely with data engineers, developers, and stakeholders to align infrastructure solutions with business needs.
  • Documentation & Knowledge Sharing
    • Create and maintain clear technical documentation for infrastructure, pipelines, and operational procedures.

Requirements

  • AWS Expertise
    • Strong hands-on experience with AWS services, including compute, storage, networking, and identity management components.
  • Databricks Experience
    • Practical experience working with Databricks for managing data processing environments and executing data workloads.
  • Experience designing and maintaining Jenkins pipelines for continuous integration and continuous deployment processes.
  • Solid understanding of DevOps principles, including automation, infrastructure as code, and continuous delivery practices.
  • Python Knowledge
    • Working knowledge of Python for scripting, automation, and supporting data-related processes within cloud environments.
  • Strong proficiency in Linux systems, including command-line usage, shell scripting, and system administration basics.
  • Experience with monitoring tools and logging systems to track system performance and troubleshoot issues effectively.
  • Problem-Solving Skills
    • Ability to analyze complex systems, identify root causes, and implement effective solutions in distributed environments.
  • CI/CD Best Practices
    • Understanding of version control, automated testing, and deployment strategies within modern DevOps workflows.
  • Data Workflow Understanding
    • Familiarity with data pipelines, ETL processes, and challenges related to large-scale data processing systems.
  • Ability to work effectively with cross-functional teams and communicate technical concepts to diverse stakeholders.
  • Agile Experience
    • Experience working in Agile environments, delivering tasks iteratively and adapting to changing requirements.

What we offer

  • B2B Contract
    • Employment based on a B2B contract
  • Stable and Dynamic International Firm
    • Opportunity to work in a stable, fast-growing international company
  • Engaging Projects and Latest IT
    • Chance to participate in interesting projects and work with the latest information technologies
  • Competitive Rates
    • Attractive remuneration rates offered
  • Renowned International Projects
    • Involvement in the most prestigious international projects
  • MultiSport and Private Medical Care
    • Access to a MultiSport card and private medical care

Nice to have

  • Streaming/Data Tools Exposure
    • Familiarity with data streaming or messaging systems used in data pipeline architectures.

Work with us

Apply & join the team

Didn’t find anything for yourself? Send your CV


NOTE: Please mention Fuchsjobs as the source of your application.

Job details

  • Publication date:

    30 Mar 2026
  • Location:

    Work from home
  • Type:

    Full-time
  • Work model:

    On-site
  • Category:

  • Experience:

    2+ years
  • Employment type:

    Employed
