ARHS Group, part of Accenture, is looking for a Senior/Medium Data Engineer to join the team of one of our strategic clients in the insurance sector. You will work onsite at the client’s premises and contribute to the design and evolution of their next-generation data platform.
This role is ideal for Data Engineers who want to work with modern data architectures, build scalable data platforms, and collaborate closely with cross-functional teams.
We are hiring at two levels:
- Medium Level: 3+ years of experience
- Senior Level: 5+ years of experience, with strong ownership, coordination skills, and a project-driven mindset
The Work
- Design, build, and enhance a robust and scalable data platform using technologies such as Spark, Apache Iceberg, Airflow, Trino, Docker, and Kubernetes.
- Develop and industrialize data pipelines, applying CI/CD practices (GitHub Actions, Argo, automated testing, observability).
- Support internal teams in creating, optimizing, and operationalizing data workflows.
- Integrate key data management components: metadata, lineage, data dictionaries.
- Contribute to the evolution of modern data architectures (Data Lakehouse, Data Mesh, Data Fabric).
- Participate in sprint planning, technical documentation, and project follow-up.
- Collaborate with platform engineering, business stakeholders, and cross-functional teams to ensure alignment with enterprise data strategy.
Additional expectations for Senior level
- Lead and coordinate project activities and priorities.
- Drive discussions with stakeholders and ensure timely delivery of platform evolution initiatives.
- Take ownership of architectural decisions and propose long-term improvements.
Onsite presence
This role requires an onsite presence with our clients and partners to support project delivery and strengthen client relationships.
Our roles require in-person time to encourage collaboration, learning, and relationship-building with clients, colleagues, and communities. As an employer, we will be as flexible as possible to support your specific work/life needs.
HERE’S WHAT YOU’LL NEED
- Degree in Computer Science, Engineering, or a related field.
- Hands-on experience with Spark (Scala or Python) and Airflow.
- Solid understanding of modern data architectures (Data Lakehouse, Data Mesh, Data Fabric).
- Experience with Apache Iceberg and Trino (or similar storage/query layers).
- Strong containerization skills (Docker, Kubernetes).
- Good knowledge of CI/CD tooling (GitHub Actions, Argo).
- Familiarity with testing, versioning, observability, and automation best practices.
- Experience with metadata management or data governance is an asset.
- Fluency in French (mandatory).
- Comfortable working in English.
- Strong communication and collaboration abilities.
- Ability to present ideas clearly to technical and non-technical audiences.
- Ownership mindset and ability to work autonomously.
For Medium Level
- Minimum 3 years of experience in Data Engineering or related roles.
For Senior Level
- Minimum 5 years of experience in Data Engineering, plus a proven ability to lead and coordinate data platform projects.
- Strong organizational, prioritization, and project-management mindset.
As part of our security and compliance procedures, candidates will be required to undergo a criminal background check prior to employment.