Empiric

Senior GCP Data Engineer

Finland · Contract · Mid-Senior
Senior GCP Data Engineer - BigQuery, Dataflow (Apache Beam), Cloud Storage, Cloud Functions, Cloud Composer, Pub/Sub, Datastream, Apache Spark, and Kafka.


Location: Helsinki (hybrid working)

Contract Duration: 12 Months

Employment Type: Freelance Contract


We are seeking an experienced GCP Data Engineer/Developer to join a leading Telco client on a 12-month freelance assignment. The role involves developing, maintaining, and optimizing ETL/ELT pipelines on Google Cloud Platform (GCP) to support day-to-day business processes. You will work closely with Informatica experts to understand existing ETL logic, business rules, and source systems, translating them into scalable, secure, and efficient GCP-native solutions.


Key Responsibilities:

  • Develop and maintain ETL/ELT pipelines using GCP services including BigQuery, Dataflow (Apache Beam), Cloud Storage, Cloud Functions, Cloud Composer, Pub/Sub, and Datastream.
  • Collaborate with Informatica experts to understand ETL job logic, transformations, workflows, and business rules.
  • Design GCP-native equivalents for existing ETL jobs, ensuring scalability, performance, and cost efficiency.
  • Implement automation and deployment pipelines using CI/CD frameworks.
  • Configure and manage IAM roles, network policies, and monitoring solutions to ensure secure and compliant data environments.
  • Troubleshoot and resolve pipeline, workflow, and data storage issues.
  • Optimize pipelines and data processing workflows for performance and cost efficiency.
  • Work with orchestration tools like Cloud Composer (Apache Airflow) to manage complex workflows.


Essential Skills & Experience:

  • 6–8 years of experience in data engineering, with at least 3 years focused on GCP.
  • Hands-on experience with GCP core services: BigQuery, Dataflow, Cloud Storage, Cloud Functions, Cloud Composer.
  • Strong background in ETL/ELT pipelines, data integration, and workflow orchestration.
  • Knowledge of relational databases, data warehousing, and schema design in GCP environments.
  • Proficiency in Python, Bash, or similar scripting languages for automation and troubleshooting.
  • Familiarity with Apache Spark, Kafka, or other big data technologies is a plus.
  • Experience in setting up IAM roles, network configurations, and monitoring solutions in GCP.


Desirable Skills:

  • Expertise in Kubernetes (GKE) and container orchestration.
  • Familiarity with Terraform, Ansible, or other Infrastructure as Code tools.
  • Experience with CI/CD tools such as Jenkins, GitHub Actions, GitLab CI/CD, or Cloud Build.
  • Hands-on experience with monitoring tools like Prometheus, Grafana, or Stackdriver.
  • Google Cloud Professional Cloud Architect or Cloud Engineer certification preferred.
  • Knowledge of multi-cloud or hybrid cloud environments.
  • Exposure to data migration and transformation projects.
  • Strong understanding of networking concepts, security best practices, and IAM policies.
  • Excellent problem-solving skills and ability to collaborate in a team environment.

Posted: Feb 11, 2026
Type: Contract
Level: Mid-Senior
Location: Helsinki Metropolitan Area
Company: Empiric
Industries: Technology, Information and Media
Categories: Information Technology
