Hello,
I hope you are doing well.
Position – GCP Data Engineer
Location – Helsinki, Finland (Onsite)
Experience – 8+ years
Duration – 6-month B2B contract, extendable
Job Description
We are seeking an experienced GCP Expert Developer to join a telco client, developing and maintaining ETL jobs in GCP for day-to-day business processes.

Requirements:
- 8+ years of experience as a Data Engineer with a focus on the GCP cloud platform.
- Strong hands-on experience with GCP services such as BigQuery, Dataflow (Apache Beam), Cloud Storage, Cloud Functions, and Cloud Composer.
- Experience building ELT pipelines and working with data integration frameworks.
- Proficiency in setting up IAM roles and policies, network configurations, and monitoring solutions to ensure robust, compliant data processing environments in GCP.
- Hands-on experience with Cloud Composer (Apache Airflow) or other orchestration tools to manage complex workflows.
- Deep knowledge of relational databases, data warehousing, and schema design in GCP environments.
- Ability to troubleshoot and resolve issues related to data pipelines, workflows, and data storage configurations.
- Proficiency in scripting languages (e.g., Python, Bash) for automation and troubleshooting tasks.
- Familiarity with Apache Spark, Kafka, or other big data tools is a plus.
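The ELT work described above loads raw data first and transforms it inside the warehouse. As a rough, dependency-free sketch of that pattern (all function and field names are hypothetical; a real pipeline here would use Dataflow/Apache Beam and BigQuery SQL rather than plain Python):

```python
# Minimal sketch of the ELT pattern: extract -> load raw -> transform.
# All names are hypothetical and stand in for real GCP services.

def extract(source_rows):
    """Pull raw records from a source system (here: an in-memory list)."""
    return list(source_rows)

def load(raw_table, rows):
    """Land records untransformed, as ELT loads into a staging table."""
    raw_table.extend(rows)
    return raw_table

def transform(raw_table):
    """Apply business rules after loading, as SQL would inside BigQuery."""
    return [
        {"id": r["id"], "amount_eur": round(r["amount_cents"] / 100, 2)}
        for r in raw_table
        if r.get("amount_cents") is not None  # drop incomplete records
    ]

if __name__ == "__main__":
    staging = []
    rows = extract([{"id": 1, "amount_cents": 1250},
                    {"id": 2, "amount_cents": None}])
    load(staging, rows)
    print(transform(staging))  # -> [{'id': 1, 'amount_eur': 12.5}]
```

The point of the ordering (load before transform) is that cleansing and business rules run inside the warehouse, where they can be changed without re-ingesting source data.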
Responsibilities:
- Collaborate closely with Informatica experts to understand existing ETL job logic, business rules (existing GCP jobs or new requirements), and source systems, and translate them into efficient, scalable GCP-native solutions.
- Design and implement equivalent data pipelines using GCP services such as BigQuery, Datastream, Dataflow, Pub/Sub, and Cloud Storage.
- Optimize pipelines for performance, scalability, and cost-efficiency.
- Work with Informatica specialists to understand ETL mappings, transformations, and workflows.
- Ensure all pipelines meet security, privacy, and compliance requirements.
- Assist in automating pipeline deployment using CI/CD frameworks.

Additional Skills:
- Strong expertise in Google Cloud Platform (GCP) and its core services.
- Hands-on experience with Kubernetes (GKE) and container orchestration.
- Proficiency in Terraform, Ansible, or other Infrastructure as Code tools.
- Familiarity with CI/CD tools like Jenkins, GitHub Actions, GitLab CI/CD, or Cloud Build.
- Experience with monitoring tools like Prometheus, Grafana, or Stackdriver.
- Google Cloud Professional Cloud Architect or Cloud Engineer certification.
- Experience with multi-cloud or hybrid cloud environments.
- Exposure to migrations and transformations.
- Strong scripting skills in Python, Bash, or PowerShell.
- Knowledge of networking concepts, security best practices, and IAM policies.
- Excellent problem-solving skills and ability to work in a team environment.
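The automated pipeline deployment mentioned above could, for example, be driven by Cloud Build. A minimal hypothetical `cloudbuild.yaml` (step images, paths, and the bucket name are assumptions for illustration, not from the posting) that tests the pipeline code and syncs DAGs to a Cloud Composer environment's bucket:

```yaml
# Hypothetical Cloud Build config: install deps, run tests, then deploy.
steps:
  # Install the pipeline's Python dependencies.
  - name: 'python:3.11'
    entrypoint: 'pip'
    args: ['install', '-r', 'requirements.txt', '--user']
  # Run the unit tests before any deployment happens.
  - name: 'python:3.11'
    entrypoint: 'python'
    args: ['-m', 'pytest', 'tests/']
  # Sync DAG files to the Composer environment's GCS dags/ folder.
  - name: 'gcr.io/google.com/cloudsdktool/cloud-sdk'
    entrypoint: 'gsutil'
    args: ['-m', 'rsync', '-r', 'dags/', 'gs://my-composer-bucket/dags/']
```

Gating the `gsutil rsync` step behind the test step means a failing test stops the deployment, which is the main property a CI/CD setup like this should provide.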
Ready to apply? Join Ubique Systems and take your career to the next level!

