Gazelle Global
GCP Data Engineer
Gazelle Global · Finland · 10 days ago
Contract · Information Technology

We are hiring an experienced GCP Data Engineer to join our team in Helsinki, Finland. The ideal candidate will have strong hands-on experience with Google Cloud Platform (GCP) services and data engineering practices, and a proven track record of building scalable, robust data pipelines in cloud environments.

This role involves designing, developing, and maintaining ETL/ELT pipelines while ensuring secure, compliant, and high-performing data processing solutions within GCP.

Key Responsibilities

  • Design, develop, and maintain scalable ETL/ELT data pipelines using GCP services.
  • Build and optimize data solutions using BigQuery, Dataflow (Apache Beam), Cloud Storage, Cloud Functions, and Cloud Composer.
  • Implement workflow orchestration using Cloud Composer (Apache Airflow) or similar tools.
  • Configure and manage IAM roles, policies, networking, and monitoring solutions within GCP.
  • Ensure high availability, performance, and data integrity across cloud environments.
  • Troubleshoot and resolve issues related to data pipelines, workflows, and storage configurations.
  • Design and implement efficient schema models for data warehousing solutions.
  • Automate processes using scripting languages such as Python and Bash.
  • Collaborate with cross-functional teams including data scientists and DevOps engineers.
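To give a concrete flavor of the ETL/ELT work described above, here is a minimal Python sketch of an extract-transform-load pipeline over hypothetical JSON event records. It uses only the standard library; in this role, equivalent logic would typically run as a Dataflow (Apache Beam) job or a Cloud Composer-orchestrated task, and the record fields shown are illustrative assumptions, not part of the posting.

```python
import json

# Hypothetical raw records, standing in for rows extracted from a source system.
RAW_EVENTS = [
    '{"user_id": 1, "amount": "19.99", "currency": "EUR"}',
    '{"user_id": 2, "amount": "bad",   "currency": "EUR"}',  # invalid amount
    '{"user_id": 3, "amount": "5.00",  "currency": "USD"}',
]

def extract(lines):
    """Parse each raw JSON line; skip records that fail to parse."""
    for line in lines:
        try:
            yield json.loads(line)
        except json.JSONDecodeError:
            continue

def transform(records):
    """Cast amounts to float, dropping records with invalid values."""
    for rec in records:
        try:
            rec["amount"] = float(rec["amount"])
        except (TypeError, ValueError):
            continue
        yield rec

def load(records):
    """Stand-in for a warehouse load (e.g. a BigQuery load job)."""
    return list(records)

rows = load(transform(extract(RAW_EVENTS)))
print(rows)  # the two valid records survive the pipeline
```

The generator-based stages mirror the streaming, record-at-a-time shape of a Beam pipeline, where each stage would instead be a `ParDo` transform.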

Required Skills & Experience

  • 4+ years of experience as a Data Engineer with a strong focus on Google Cloud Platform.
  • Hands-on experience with:
      • BigQuery
      • Dataflow (Apache Beam)
      • Cloud Storage
      • Cloud Functions
      • Cloud Composer (Apache Airflow)
  • Experience building and maintaining ETL/ELT pipelines.
  • Strong knowledge of relational databases and data warehousing concepts.
  • Experience configuring IAM roles, policies, and network settings in GCP.
  • Proficiency in Python and/or Bash scripting.
  • Strong troubleshooting and analytical skills.

Preferred Qualifications

  • Google Cloud Professional Data Engineer certification (or other relevant GCP certifications).
  • Experience with machine learning pipelines, Vertex AI, or MLOps.
  • Familiarity with Apache Spark, Kafka, or other big data technologies.
