Freelance Data Engineer – Databricks Migration Project
Location: Brussels, Belgium (1 day per week on-site)
Duration: 12 months
Start Date: ASAP
Rate: €950 per day
Client: Leading Investment Banking Institution
Project Overview
Our client, a sector-leading investment banking organisation, is seeking an experienced Freelance Data Engineer with strong Databricks expertise to support a large-scale data platform migration programme.
The project involves migrating legacy data infrastructure to a modern, cloud-based Databricks environment, ensuring scalability, performance optimisation, and regulatory compliance within a highly secure financial services landscape.
Key Responsibilities
- Lead and support the migration of existing data pipelines to Databricks
- Design, develop, and optimise data workflows within a cloud environment
- Refactor legacy ETL processes into scalable Databricks solutions
- Collaborate with data architects, platform teams, and business stakeholders
- Ensure data quality, governance, and security standards are maintained
- Provide technical guidance and best practice recommendations
- Support testing, validation, and production deployment
Required Experience
- Proven hands-on experience with Databricks (essential)
- Strong background in data migration and modernisation projects
- Solid experience with Spark (PySpark/Scala)
- Experience working within cloud platforms (Azure preferred)
- Strong understanding of data architecture and large-scale data processing
- Previous experience within financial services or investment banking (highly desirable)
- Excellent stakeholder communication skills
Key Skills
- Databricks (core requirement)
- Apache Spark / PySpark
- Azure Data Platform (ADLS, Data Factory, etc.)
- SQL
- CI/CD for data pipelines
- Data governance & regulatory awareness
Working Model
- 12-month freelance contract
- 1 day per week on-site in Brussels
- €1300 per day
This is a high-impact role within a mission-critical transformation programme at a leading investment bank.
Join Next Ventures and take your career to the next level!

