CGS provides an extensive array of ICT Services, focusing on Cloud, Data Centre operations, Networking, Cybersecurity, BI and Data Warehouse, Big Data, Service Desk, Proactive Monitoring, Operations and Support, Service Management, Project and Programme Management, and Professional Services.
Responsibilities:
- Design, configure, and deploy Kafka clusters to support high-volume, real-time data streams across multiple environments
- Design, implement, and maintain CI/CD pipelines using Azure DevOps (Repos, Git, Pipelines)
- Develop and maintain automation scripts using Bash (Python is a plus)
- Manage, configure, and automate infrastructure using Ansible
- Perform Linux system administration and work extensively with Linux commands
- Deploy and manage applications and infrastructure across AWS and Azure cloud platforms
- Implement, support, and troubleshoot Kafka-based messaging and streaming solutions, including installation and configuration
- Design and document architecture diagrams for solutions and deployments
- Monitor platform health and troubleshoot issues using tools such as Splunk
- Ensure system reliability, scalability, and security across all environments
- Collaborate with development and operations teams to improve delivery processes and platform standards
- Stay up to date with advancements in the Kafka ecosystem and best practices in DevOps tooling
Requirements:
- Strong understanding of Kafka architecture, internals, and the broader ecosystem
- University degree in computer science, mathematics, physics, or engineering, or equivalent applied experience; an M.Sc. is desirable
- 5+ years of experience working with Kafka in large-scale, production environments
- Hands-on expertise with Confluent Kafka and Confluent Cloud
- Proficiency with Azure DevOps (Repos, Git, Pipelines)
- Strong Bash scripting skills; Python experience is a plus
- Solid knowledge of Ansible and Linux system administration
- Experience deploying to and managing workloads in AWS and Azure
- Ability to create clear, accurate architecture and deployment diagrams
- Knowledge of Infrastructure-as-Code (IaC) tools such as Terraform and Ansible
- Familiarity with DevOps tools such as GitHub, Nexus, Jira, Azure DevOps, and Splunk
- Exposure to containerization technologies (Docker, Kubernetes) is a plus
- Understanding of cloud and DevOps security best practices
- Excellent communication skills and the ability to work collaboratively in a team environment
- Self-motivated, proactive, and comfortable working in a fast-paced environment
Ready to apply?
Join Cosmote Global Solutions and take your career to the next level!
Applying takes less than 5 minutes.

