We are hiring an experienced GCP Data Engineer to join our team in Helsinki, Finland. The ideal candidate will have strong hands-on experience with Google Cloud Platform (GCP) services and data engineering practices, and a proven track record of building scalable, robust data pipelines in cloud environments.
This role involves designing, developing, and maintaining ETL/ELT pipelines while ensuring secure, compliant, and high-performing data processing solutions within GCP.
Key Responsibilities
- Design, develop, and maintain scalable ETL/ELT data pipelines using GCP services.
- Build and optimize data solutions using BigQuery, Dataflow (Apache Beam), Cloud Storage, Cloud Functions, and Cloud Composer.
- Implement workflow orchestration using Cloud Composer (Apache Airflow) or similar tools.
- Configure and manage IAM roles, policies, networking, and monitoring solutions within GCP.
- Ensure high availability, performance, and data integrity across cloud environments.
- Troubleshoot and resolve issues related to data pipelines, workflows, and storage configurations.
- Design and implement efficient schema models for data warehousing solutions.
- Automate processes using scripting languages such as Python and Bash.
- Collaborate with cross-functional teams including data scientists and DevOps engineers.
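To illustrate the kind of pipeline work the responsibilities above describe, here is a minimal, self-contained sketch of a transform step that validates raw CSV rows into typed records ready for loading into a warehouse table. All names here (`OrderRecord`, `parse_order_row`, the `orders` schema) are hypothetical; in a real GCP pipeline this logic would typically live inside an Apache Beam `DoFn` running on Dataflow, with the output written to BigQuery.

```python
import csv
import io
from dataclasses import dataclass, asdict
from datetime import datetime, timezone


@dataclass
class OrderRecord:
    """Typed record matching a hypothetical BigQuery `orders` table schema."""
    order_id: str
    amount_eur: float
    ordered_at: str  # ISO-8601 string, compatible with BigQuery TIMESTAMP


def parse_order_row(raw_line: str):
    """Parse one raw CSV line (order_id, amount, unix_ts) into a typed record.

    Returns None for malformed rows so bad input is filtered out instead of
    crashing the whole pipeline -- a common pattern in streaming ETL.
    """
    try:
        order_id, amount, ts = next(csv.reader(io.StringIO(raw_line)))
        return OrderRecord(
            order_id=order_id.strip(),
            amount_eur=round(float(amount), 2),
            ordered_at=datetime.fromtimestamp(int(ts), tz=timezone.utc).isoformat(),
        )
    except (ValueError, StopIteration):
        return None


def transform(lines):
    """The 'T' of a toy ETL step: keep only rows that parse cleanly."""
    return [asdict(r) for line in lines if (r := parse_order_row(line)) is not None]
```

In a Beam pipeline the same function could be applied per-element with `beam.Map` followed by a filter; keeping the parsing logic as a plain, dependency-free function like this makes it easy to unit-test outside the pipeline runner.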
Required Skills & Experience
- 4+ years of experience as a Data Engineer with strong focus on Google Cloud Platform.
- Hands-on experience with:
  - BigQuery
  - Dataflow (Apache Beam)
  - Cloud Storage
  - Cloud Functions
  - Cloud Composer (Apache Airflow)
- Experience building and maintaining ETL/ELT pipelines.
- Strong knowledge of relational databases and data warehousing concepts.
- Experience configuring IAM roles, policies, and network settings in GCP.
- Proficiency in Python and/or Bash scripting.
- Strong troubleshooting and analytical skills.
Preferred Qualifications
- Google Cloud Professional Data Engineer certification (or other relevant GCP certifications).
- Experience with machine learning pipelines, Vertex AI, or MLOps.
- Familiarity with Apache Spark, Kafka, or other big data technologies.
Join Gazelle Global and take your career to the next level!

