Job Location: Zurich, Switzerland
Experience: 4+ years
Pay Rate: TBN
Start Date: ASAP
Contract: 6 months, renewable
Role Description
- Operate Global Data Platform components, including VM servers, Kubernetes, and Kafka, as well as applications such as the Apache stack, Collibra, Dataiku, and similar tools.
- Implement automation of infrastructure, security components, and CI/CD pipelines to ensure optimal execution of ELT/ETL data pipelines.
- Develop and implement solutions to improve resiliency of data pipelines, including platform health checks, monitoring, and alerting mechanisms.
- Apply DevSecOps and Agile approaches to deliver holistic, integrated solutions in iterative increments.
- Liaise and collaborate with enterprise security, digital engineering, and cloud operations teams.
- Review system issues, incidents, and alerts to identify root causes and drive resolution.
- Stay current with industry developments and technology trends.
Experience
- 4–6 years of relevant experience.
- At least 5 years designing large-scale distributed systems.
- Experience with streaming and file-based ingestion (Kafka, Control-M, AWA).
- DevOps experience with Jenkins and Octopus; familiarity with Ansible, Chef, or XL tools is a plus.
- Experience with on-premises Big Data architectures and their migration to the cloud is useful.
- Experience integrating Data Science Workbenches (e.g., Dataiku).
- Experience working with Agile methodologies (Scrum, SAFe).
- Experience supporting enterprise reporting and data science platforms.
Technical Skills
- Knowledge of data lakes, delta lakes, data meshes, and data platforms.
- Distributed technologies: S3, Parquet, Kafka, Kubernetes, Spark.
- Programming and scripting: Python, Java, Scala, R, Linux scripting, Jinja, Puppet.
- Infrastructure and security: firewall rules, VM setup, Kubernetes scaling.
- Containers and CI/CD: Docker, Harbor, and experience building CI/CD pipelines.
Education
- Higher education (FH, WI)
Language Skills
- English – essential
- German – beneficial

