Databricks Data Engineer – Dublin (Hybrid, 3 days/week)
Type: Permanent | Location: Dublin | Hybrid: 3 days/week onsite
Sector: Multi-client Data Projects | Level: Mid-Level (min 3 years Databricks experience)
Are you a data engineer who’s happiest when building things that just work — scalable pipelines, clean architectures, and dashboards that tell the right story at the right time?
We’re partnering with several enterprise clients who are investing heavily in their Databricks and cloud data ecosystems, and we’re looking for engineers who want to play a key role in shaping those environments.
What you’ll be doing
You’ll join a collaborative data engineering squad working across multiple client projects — designing, developing, and optimising ETL pipelines and data models using Databricks, Spark, and Delta Lake. Expect a healthy mix of hands-on build work, problem-solving, and collaboration with analytics, platform, and architecture teams to ensure data is accessible, reliable, and insight-ready.
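To give a flavour of the day-to-day, here is a minimal illustrative sketch of the kind of PySpark/Delta Lake batch pipeline involved (table names, paths, and columns are hypothetical, not taken from any client project):

```python
# Illustrative sketch only: a minimal PySpark batch pipeline writing to a Delta table.
# Paths, column names, and table names below are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Ingest raw landing-zone data (hypothetical location)
raw = spark.read.json("/mnt/landing/orders/")

# Cleanse and model: deduplicate, derive a date column, drop invalid rows
orders = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_date", F.to_date("order_ts"))
       .filter(F.col("amount") > 0)
)

# Persist as a Delta table so analytics and BI teams can query it reliably
orders.write.format("delta").mode("overwrite").saveAsTable("silver.orders")
```

In practice the role goes beyond one-off scripts like this: expect orchestration, incremental loads, and performance tuning across several client environments.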
What we’re looking for
- Formal Databricks certification(s) in an engineering or development track
- Solid experience with Databricks, PySpark, and SQL
- Strong grounding in data lakehouse and Delta Lake principles
- Cloud expertise on Azure or AWS (bonus points if you’ve worked with both)
- Comfort building and maintaining ETL pipelines and production-grade data flows
- Understanding of CI/CD, version control (Git), and automation best practices
- Someone who enjoys variety – you’ll work across different industries and teams
Why you’ll love it
- Work with modern data stacks and diverse clients solving real business problems
- Hybrid setup in Dublin – 3 days a week on-site with a close-knit, collaborative team
- Growth environment: you’ll be supported to upskill in cloud, data architecture, and platform engineering
- A people-first culture that values initiative, learning, and genuine collaboration
If this sounds like you, click apply today!