About the Project
We focus on developing global solutions across multiple regions, aiming for efficient, user-friendly, and scalable systems. Robust data products are key to mitigating risk and enhancing our capabilities.
Your role will contribute to:
- Managing a Data Hub with operations across multiple countries.
- Incorporating new data, products, and countries into the hub.
- Supporting internal teams that consume this information.
- Collaborating with different countries and data providers.
We follow SCRUM/Agile methodology, emphasizing continuous delivery and rapid feedback. Our tech stack includes Scala, Python, Java, SQL/HQL, Airflow, ControlM, GitHub, Hive, Databricks, Azure, S3, and Maven.
Quality assurance involves unit tests, automation, and peer code reviews.
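To illustrate the unit-test-driven QA approach described above, here is a minimal sketch in Python (one of the stack's languages). The function, field names, and test are hypothetical examples, not part of the actual project:

```python
def normalize_amount(record: dict) -> dict:
    """Technical validation plus transformation for one incoming record:
    coerce the 'amount' field to a float and tag invalid rows instead of
    dropping them silently."""
    out = dict(record)
    try:
        out["amount"] = float(record["amount"])
        out["valid"] = True
    except (KeyError, TypeError, ValueError):
        out["valid"] = False
    return out

def test_normalize_amount():
    # A unit test of the kind exercised in peer code review.
    assert normalize_amount({"amount": "12.5"}) == {"amount": 12.5, "valid": True}
    assert normalize_amount({})["valid"] is False

test_normalize_amount()
```

In a real Spark/Scala pipeline the same idea applies: keep transformations as small pure functions so each one can be unit-tested before integration testing.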
What You Will Be Doing
As a Backend Spark Developer, you will:
- Develop, test, and deploy ETL processes with Spark/Scala, including data transfer, technical validations, and business logic.
- Collaborate in high-performance Scrum teams.
- Document solutions using JIRA, Confluence, and ALM.
- Ensure delivery quality with integration and unit testing.
- Work with a technical specialist to improve architecture and implementation.
- Integrate data from various sources with cross-functional and cross-country teams.
- Develop and maintain data hubs and pipelines in a microservices architecture.
- Collaborate on maintaining and improving CI/CD pipelines.
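The ETL responsibilities above (data transfer, technical validations, business logic) can be sketched as a three-stage pipeline. This is a simplified pure-Python illustration rather than actual Spark/Scala code, and all names and fields are illustrative:

```python
from typing import Iterable

def extract(rows: Iterable[dict]) -> list:
    # Data transfer stage: a real pipeline would read from Hive, S3,
    # or Databricks; here we just materialize the input.
    return list(rows)

def validate(rows: list) -> list:
    # Technical validation: keep only rows carrying the required fields.
    return [r for r in rows if "id" in r and "country" in r]

def apply_business_logic(rows: list) -> list:
    # Business logic stage: e.g. flag high-value records.
    return [{**r, "high_value": r.get("amount", 0) > 1000} for r in rows]

def run_pipeline(rows: Iterable[dict]) -> list:
    return apply_business_logic(validate(extract(rows)))

result = run_pipeline([
    {"id": 1, "country": "ES", "amount": 2500},
    {"country": "PT"},  # dropped by validation: missing "id"
])
```

Keeping each stage a separate function mirrors how such pipelines are typically orchestrated (e.g. as Airflow or ControlM tasks) and makes each stage independently testable.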
What We Are Looking For
Required Qualifications:
- Bachelor’s degree in Computer Science or a related field.
- 1+ years of experience in data/software engineering (preferably in banking).
- Experience with integration solutions (API, microservices).
- Backend development with Big Data technologies (Apache Spark).
- Familiarity with Agile methodology.
- Knowledge of CI/CD tools (Git, GitHub).
- Knowledge of SQL databases.
- English proficiency (B2+).
Preferred Qualifications:
- GitHub Actions
- Scala or Python programming
- Bash scripting
- ControlM, Airflow
- Software development lifecycle tools (HP ALM)
- Basics of cybersecurity and code-quality tools (Sonar)
- Cloud computing knowledge (Docker, Kubernetes, S3, Azure, AWS, EMR, Databricks)
- SOA architecture
- Analytical skills
Ready to apply?
Join Ascendion and take your career to the next level!
Application takes less than 5 minutes

