Our client, a private European investment bank, is looking for a Data Engineer to design, build, and maintain the data pipelines, backend services, and governance frameworks that power real-time decisioning and analytics across the organisation. This role suits someone who enjoys solving complex data problems, building robust backend systems, and working with modern cloud and streaming technologies in a regulated environment.
Responsibilities
- Design, develop, and maintain real-time data pipelines, backend services, and workflow orchestration for decisioning, reporting, and data processing (see the streaming sketch after this list).
- Define data requirements, models, and transformation logic across structured and unstructured data sets.
- Write high-quality, secure, well-tested code in Python, Scala, or similar languages.
- Build software and processes that strengthen data governance, data quality, and data security.
- Support scalable data architectures using Azure, Delta/Live Tables, and distributed computing frameworks.
- Troubleshoot and resolve data issues across pipelines, storage layers, and downstream systems.
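To give a concrete flavour of the pipeline work described above: a common pattern on this stack reads events from Kafka with Spark Structured Streaming and appends them to a Delta table. The following is a minimal sketch only, assuming a Databricks/PySpark environment; the broker address, topic name, schema, and storage paths are hypothetical placeholders, not details from the role.

```python
# Minimal structured-streaming sketch: read events from Kafka and append
# them to a Delta table. Broker, topic, and paths are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("decisioning-events").getOrCreate()

raw = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
    .option("subscribe", "decision-events")            # placeholder topic
    .load()
)

# Kafka delivers key/value as binary; cast to strings for downstream parsing.
events = raw.select(
    F.col("key").cast("string").alias("event_key"),
    F.col("value").cast("string").alias("payload"),
    F.col("timestamp").alias("ingested_at"),
)

(
    events.writeStream
    .format("delta")
    .outputMode("append")
    .option("checkpointLocation", "/mnt/checkpoints/decision_events")  # placeholder
    .start("/mnt/delta/decision_events")                               # placeholder
)
```

The checkpoint location is what makes the stream restartable after failure, which is the kind of robustness a regulated environment demands.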
Requirements
- BS/MS in Computer Science, Data Engineering, Information Systems, or equivalent experience.
- 5+ years building end-to-end data systems, ETL/ELT pipelines, and workflow management using tools such as Airflow or Databricks (see the DAG sketch below).
- Strong SQL skills for diagnosing data issues and validating complex transformations.
- Hands-on experience with large-scale data processing using Databricks, PySpark, Spark Streaming, and Delta Tables.
- Experience in Azure cloud data environments, including Data Lake Storage and CI/CD deployment workflows.
- Familiarity with container and microservices platforms (Kubernetes, Docker) and event-driven systems (Kafka, Event Hub, Event Grid, Flink).
- Experience developing in Python, Scala, JavaScript, Java, or C#.
- Knowledge of dbt, Data Vault, or Microsoft Fabric is a plus.
- Prior experience in banking or financial services is highly preferred.
(12-Month Contract, Convertible to Permanent)
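On the orchestration side, the Airflow experience asked for above typically means writing DAGs like the one below. This is a sketch only, assuming Airflow 2.4+ with the TaskFlow API; the DAG id, schedule, paths, and task bodies are hypothetical placeholders.

```python
# Minimal Airflow 2.4+ TaskFlow DAG: extract, validate, and load steps
# chained in sequence. DAG id, schedule, and task logic are hypothetical.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def positions_pipeline():
    @task
    def extract() -> str:
        # Pull the day's raw file; return its landing path (placeholder).
        return "/mnt/raw/positions/latest.parquet"

    @task
    def validate(path: str) -> str:
        # Run row-count and null checks before loading; fail fast on bad data.
        return path

    @task
    def load(path: str) -> None:
        # Merge the validated file into the curated table.
        pass

    load(validate(extract()))


positions_pipeline()
```

Passing each task's return value to the next lets Airflow infer the dependency chain, so extract, validate, and load run strictly in order.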

