Our client is an integrated shipping services company that has established itself as an independent carrier with a fresh and innovative approach to global logistics. The company operates a smart and efficient worldwide network, delivering reliable shipping services and ensuring stable operations for customers across the globe.
In this role, you will design and build event-driven solutions using Kafka and related technologies, focusing on stream processing applications, data pipelines, and monitoring solutions built with the ELK stack.
Responsibilities:
- Design and develop event-driven architectures using Apache Kafka and Kafka Streams;
- Build and maintain stream processing applications and data pipelines;
- Design and implement ELK pipelines, indexes, dashboards, and alerts;
- Lead architecture discussions and contribute to the design of Kafka-based event streaming systems;
- Configure Kafka connectors and manage Kafka topics;
- Develop scalable event-based processing applications according to architecture and design specifications;
- Support development and QA teams throughout the software development lifecycle;
- Contribute to monitoring, observability, and reliability of streaming platforms.
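To give a concrete flavor of the ELK pipeline work listed above, a minimal Logstash pipeline that consumes Kafka events and indexes them into Elasticsearch might look like the sketch below. All topic, field, host, and index names are illustrative assumptions, not part of the role description.

```
# Minimal illustrative Logstash pipeline: Kafka topic -> Elasticsearch.
# Hypothetical names: "shipping-events" topic, "event_time" field.
input {
  kafka {
    bootstrap_servers => "localhost:9092"
    topics            => ["shipping-events"]
    codec             => "json"
  }
}

filter {
  # Parse the event timestamp so Kibana time filters work correctly.
  date {
    match  => ["event_time", "ISO8601"]
    target => "@timestamp"
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "shipping-events-%{+YYYY.MM.dd}"
  }
}
```

In practice such pipelines also carry dead-letter handling and index lifecycle settings; this sketch only shows the input/filter/output shape the responsibilities refer to.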
Requirements:
- 4+ years of experience building Kafka-based applications;
- Strong experience with Kafka ecosystem components:
  - Kafka Streams (including the Processor API)
  - Kafka Connect
  - Schema Registry
- 4+ years of experience with Java and Spring, implementing Kafka Streams applications;
- Hands-on experience with the ELK stack (Elasticsearch, Logstash, Kibana) for monitoring and observability;
- Experience writing Logstash pipeline configurations (inputs, filters, outputs);
- Solid knowledge of SQL and Oracle PL/SQL, including query optimization;
- Experience working with REST APIs;
- Experience using Kafka CLI tools and REST APIs for managing Kafka resources;
- Experience working in Linux/Unix environments;
- Experience with IntelliJ IDEA;
- Experience contributing to DevOps processes for Kafka and ELK.
Nice to have:
- Experience managing and maintaining ELK servers;
- Experience with Confluent Kafka Cloud;
- Experience with Python;
- Exposure to AI/ML-related projects.
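As an illustration of the "Kafka CLI tools and REST APIs for managing Kafka resources" requirement, here is a sketch of a Kafka Connect sink configuration that forwards a topic into Elasticsearch, as it might be submitted to the Connect REST API. The connector name, topic, and URLs are hypothetical, and the Confluent Elasticsearch sink is one common choice, not necessarily the one this team uses.

```
{
  "name": "shipping-events-es-sink",
  "config": {
    "connector.class": "io.confluent.connect.elasticsearch.ElasticsearchSinkConnector",
    "tasks.max": "1",
    "topics": "shipping-events",
    "connection.url": "http://localhost:9200",
    "key.ignore": "true",
    "schema.ignore": "true"
  }
}
```

A configuration like this would typically be registered against a Connect worker's REST endpoint, e.g. `curl -X POST -H "Content-Type: application/json" --data @sink.json http://localhost:8083/connectors`.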
We offer:
- Flexible working format: remote, office-based, or flexible
- A competitive salary and good compensation package
- Personalized career growth
- Professional development tools (mentorship program, tech talks and trainings, centers of excellence, and more)
- Active tech communities with regular knowledge sharing
- Education reimbursement
- Memorable anniversary presents
- Corporate events and team buildings
- Other location-specific benefits
Note: benefits are not applicable for freelancers.
Ready to apply? Join N-iX and take your career to the next level!

