Job Title: Data Platform Migration Consultant (GCP)
About the Assignment:
We are looking for an experienced consultant to support the modernization and migration of an existing on-premise data platform to a cloud-native architecture on Google Cloud Platform (GCP). The current data lake is built on Apache technologies such as Spark, NiFi, and Airflow; the goal is a future-proof, compliant, and scalable cloud data foundation leveraging containerization, managed GCP services, and infrastructure as code.
You will work as part of a DataOps team, contributing directly to delivery and helping establish a sustainable and high-performing cloud data environment.
Key Responsibilities:
Support the migration of data pipelines and workflows from on-premise (Cloudera stack) to GCP.
Design and implement scalable solutions using Dataproc, Docker/Kubernetes, and Terraform.
Develop and maintain ETL processes, data pipelines, and CI/CD workflows.
Optimize performance and ensure compliance, observability, and automation across the platform.
Collaborate closely with cross-functional teams to ensure smooth integration and delivery.
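Purely as an illustration of the kind of ETL work the responsibilities above describe (and not part of the posting itself), here is a minimal Python sketch of an extract-transform-load step. It uses the standard-library sqlite3 module as a stand-in target; the actual platform uses Spark, Dataproc, and GCP services, and every name in this snippet is hypothetical.

```python
import sqlite3

def run_etl(source_rows, db_path=":memory:"):
    """Minimal ETL sketch: extract rows, transform them, load into SQLite.

    A pure-Python stand-in for the Spark/Dataproc pipelines the role
    describes; table and field names are illustrative only.
    """
    # Transform: drop invalid records and normalize the name field.
    cleaned = [
        (r["id"], r["name"].strip().lower())
        for r in source_rows
        if r.get("id") is not None and r.get("name")
    ]
    # Load: write the cleaned records into a target table (idempotent upsert).
    conn = sqlite3.connect(db_path)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS users (id INTEGER PRIMARY KEY, name TEXT)"
    )
    conn.executemany(
        "INSERT OR REPLACE INTO users (id, name) VALUES (?, ?)", cleaned
    )
    conn.commit()
    return conn

# Example run: one malformed record is filtered out during the transform.
conn = run_etl([
    {"id": 1, "name": " Alice "},
    {"id": None, "name": "x"},
    {"id": 2, "name": "Bob"},
])
rows = conn.execute("SELECT id, name FROM users ORDER BY id").fetchall()
# rows == [(1, "alice"), (2, "bob")]
```

In a real migration, the same extract/transform/load shape would typically be expressed as a Spark job on Dataproc, with the load step targeting a managed warehouse rather than SQLite.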
Required Skills:
Apache Spark
Cloud experience (preferably GCP)
Containerization (Docker, Kubernetes)
Infrastructure as Code (Terraform)
CI/CD pipelines
Python and SQL
Database management and ETL processes
Preferred Skills:
Experience with on-premise stack: Apache NiFi, Apache Airflow, Cloudera ecosystem
Experience with GCP services (e.g. Dataproc, Pub/Sub)
Scala (bonus, especially for Spark)
Observability and monitoring tools (Elasticsearch, Kibana)
YAML, Linux, Windows
Command-line interface (CLI) proficiency
Soft Skills:
Flexible and delivery-oriented mindset
Strong communication skills in English (Swedish is a plus)
Team-oriented, proactive, and structured in approach
Ready to apply?
Join Veritaz and take your career to the next level!
Application takes less than 5 minutes