Data Engineer – PySpark – Airflow – Unix – Docker/Kubernetes – CI/CD – Finland
Empiric has received an exciting opportunity for a Data Engineer with strong experience in PySpark and Airflow and proficiency in Unix. Docker and Kubernetes experience are also required.
The Data Engineer will troubleshoot data pipelines and address issues in both real-time and batch processing. The role also includes developing ETL processes and data flows using PySpark and Airflow, and maintaining and troubleshooting CI/CD pipelines using GitHub Actions and JFrog.
Hybrid: 6 to 10 days in the client's Helsinki office initially; this can be reduced over time.
Skills / Experience :
- PySpark & Airflow
- Unix
- Docker & Kubernetes
- CI / CD Tools: GitHub Actions
- Maintenance & troubleshooting
- Understanding & ensuring data quality, integrity, governance
- Term: 6 to 18 months plus extensions
- Good day rate (B2B is fine) + Starter Bonus + Free Lunch Club experience
This is a critical position. Please respond to this advert or reach out to Woody at [email protected] or +44 7887 416 338 (WhatsApp is fine) for a confidential chat and more details on the rate and this terrific project.
- Posted: Jul 01, 2025
- Type: Contract
- Level: Mid-Senior
- Location: Helsinki Metropolitan Area
- Company: Empiric