Location: Krakow (Hybrid)
Rate: Up to 2000 PLN/day
Duration: 6-month rolling contract
Overview:
- Excellent opportunity to work with a highly reputable financial services company in Poland!
- We’re seeking a skilled Spark Software Engineer to join a dynamic agile team focused on building the strategic backbone between Dealstores and Operations/Regulatory systems.
- This role is part of a long-term initiative to modernise infrastructure and leverage cloud technologies for enhanced performance and scalability.
Key Responsibilities:
- Design and develop a strategic platform enabling trade executions to flow seamlessly between systems.
- Translate epics and features into robust, scalable functionality.
- Collaborate closely with agile pod members during sprints to deliver product requirements.
- Work directly with the product team to understand and implement required features.
About the Team:
You’ll be part of the Digital Operations stream within the Investment Bank, a technology group driving transformation through simplification and innovation. The team supports regulatory and data clients by delivering cutting-edge solutions using the latest technologies. With a global footprint across London, New York, Singapore, Zurich, Hong Kong, Poland, and India, the team values diversity, autonomy, and continuous learning.
Required Skills & Experience:
- Bachelor’s degree in Computer Science or relevant certification
- Strong hands-on experience with Spark and PySpark for scalable data processing
- Strong experience with Azure Databricks
- Proven experience with Kafka for real-time and batch data workflows
- Solid understanding of cloud architecture, preferably Azure (AWS or GCP also considered)
- Proficient in CI/CD pipelines (GitLab, ADO, or GitHub)
- Skilled in test-driven development and software design principles
- Experience with Java backend development (desirable)
- Knowledge of Kubernetes and modern data infrastructure (desirable)
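To give a concrete sense of the stack listed above, here is a minimal, illustrative sketch of a PySpark Structured Streaming job that reads trade-execution events from Kafka and appends them to a Delta table, as you might run it on Azure Databricks. The topic name, event schema, broker address, checkpoint path, and target table are assumptions for illustration only; they are not details of this role.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType, TimestampType

# All names below (broker, topic, schema, paths, table) are illustrative assumptions.
# Requires the spark-sql-kafka connector and Delta Lake, both bundled with Databricks runtimes.
spark = SparkSession.builder.appName("trade-flow-sketch").getOrCreate()

# Hypothetical shape of a trade-execution event.
trade_schema = StructType([
    StructField("trade_id", StringType()),
    StructField("instrument", StringType()),
    StructField("notional", DoubleType()),
    StructField("executed_at", TimestampType()),
])

# Stream trade-execution events from a Kafka topic and parse the JSON payload.
trades = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker address
    .option("subscribe", "trade-executions")            # hypothetical topic
    .load()
    .select(F.from_json(F.col("value").cast("string"), trade_schema).alias("t"))
    .select("t.*")
)

# Append the parsed events to a Delta table, with checkpointing for fault tolerance.
query = (
    trades.writeStream.format("delta")
    .option("checkpointLocation", "/mnt/checkpoints/trade-executions")  # hypothetical path
    .outputMode("append")
    .toTable("ops.trade_executions")                     # hypothetical target table
)
query.awaitTermination()
```

The same Kafka source also supports batch reads (spark.read instead of spark.readStream), which is how a single pipeline can cover both the real-time and batch workflows mentioned above.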
Interested?
If you're passionate about data engineering and cloud technologies, and want to be part of a forward-thinking, agile team, we’d love to hear from you.

