GCP Data Engineer (BigQuery, Dataflow, Composer)
Role Description:
- Design and build scalable, secure, and high-performance data pipelines on GCP.
- Develop and optimize ETL/ELT workflows using Cloud Composer, Dataflow, Dataproc, and BigQuery.
- Implement data ingestion frameworks for batch and streaming data (Pub/Sub, Kafka, Dataflow).
- Model, partition, and optimize datasets in BigQuery for analytics use cases.
- Collaborate with data scientists, architects, and business teams to deliver end-to-end data solutions.
- Ensure data quality, reliability, and robustness through monitoring, validation, and automation.
- Implement CI/CD pipelines for data workflows using Cloud Build, Git, and Terraform.
- Optimize cost, performance, and scalability across GCP data services.
- Apply security best practices and IAM policies, and ensure compliance with organizational standards.
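The responsibilities above combine daily partitioning of datasets with data-quality validation. As a minimal sketch of those two ideas in plain Python: records are gated by a required-field check, and valid records are grouped under a YYYYMMDD key, mirroring how BigQuery partitions a table by a DATE/TIMESTAMP column. The field names (`event_ts`, `user_id`) are illustrative assumptions, not taken from the posting.

```python
from datetime import datetime, timezone

# Illustrative required fields for the quality gate (assumed, not from the posting).
REQUIRED_FIELDS = ("event_ts", "user_id")

def partition_key(record):
    """Derive a daily partition key (YYYYMMDD) from the event timestamp,
    analogous to BigQuery's date-based table partitioning."""
    ts = datetime.fromtimestamp(record["event_ts"], tz=timezone.utc)
    return ts.strftime("%Y%m%d")

def validate(record):
    """Minimal data-quality check: every required field present and non-null."""
    return all(record.get(f) is not None for f in REQUIRED_FIELDS)

def partition_batch(records):
    """Group valid records by partition key; route invalid records aside
    for later inspection (a simple dead-letter pattern)."""
    partitions, rejects = {}, []
    for rec in records:
        if validate(rec):
            partitions.setdefault(partition_key(rec), []).append(rec)
        else:
            rejects.append(rec)
    return partitions, rejects
```

In a real pipeline the same split would typically be expressed as a Dataflow (Apache Beam) transform with a side output for rejected records, and the partition key would be a clustering or partitioning column in the BigQuery destination table.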
Skills:
- Digital: Big Data and Hadoop Ecosystems
- Digital: Google Data Engineering
Join Astra-North Infoteck Inc. ("Conquering today's challenges, achieving tomorrow's vision!") and take your career to the next level.

