Requirements
Cloud & Data Engineering
Design, build, and optimize data pipelines on GCP using BigQuery, Dataflow, Dataproc, Cloud Functions, Cloud Run, and Cloud Storage.
Implement ETL/ELT workflows for batch and streaming data ingestion.
Develop scalable data lake and warehouse architectures following best practices.
Optimize query performance through partitioning, clustering, compression, and schema design (BigQuery, Bigtable, Spanner, and Firestore).
Hands-on experience with Terraform and dbt is required.
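To illustrate the partitioning and clustering practices named above, here is a minimal sketch (table and column names are hypothetical) of BigQuery DDL composed as a Python string, e.g. for submission via the BigQuery client:

```python
# Hypothetical example: DDL for a partitioned, clustered BigQuery table.
# Partitioning prunes scanned data by date; clustering co-locates rows
# for common filter columns. Dataset/table/column names are illustrative.
ddl = """
CREATE TABLE IF NOT EXISTS analytics.events (
  event_ts   TIMESTAMP NOT NULL,
  user_id    STRING,
  event_name STRING
)
PARTITION BY DATE(event_ts)      -- scan only the dates a query touches
CLUSTER BY user_id, event_name   -- sort storage blocks by filter columns
OPTIONS (partition_expiration_days = 90)
""".strip()
```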
Data Processing & Modeling
Build real‑time streaming pipelines using Pub/Sub + Dataflow (Apache Beam).
Create dimensional models, star/snowflake schemas, and semantic layers.
Write efficient SQL and Python for transformations and automation.
Looker knowledge (essential).
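The dimensional-modeling bullet above can be sketched in pure Python (all table and column names are hypothetical): a star schema resolves surrogate keys in a fact table against dimension tables, as a semantic layer would:

```python
# Illustrative star schema: one fact table referencing two dimensions.
dim_product = {1: {"name": "widget", "category": "tools"}}
dim_date = {20240101: {"year": 2024, "month": 1}}
fact_sales = [
    {"product_key": 1, "date_key": 20240101, "amount": 9.99},
]

def denormalize(fact_rows):
    """Resolve dimension keys into attributes for reporting queries."""
    out = []
    for row in fact_rows:
        product = dim_product[row["product_key"]]
        date = dim_date[row["date_key"]]
        out.append({**row,
                    "product": product["name"],
                    "category": product["category"],
                    "year": date["year"]})
    return out
```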
Orchestration & Automation
Automate workflows using Cloud Composer (Airflow).
Implement CI/CD pipelines for data deployments using Cloud Build and Terraform, integrated with GitHub.
Manage versioning, testing, and environment configuration.
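Cloud Composer (Airflow) runs workflows as DAGs, executing each task only after its upstream dependencies succeed. A pure-Python stand-in (task names hypothetical, using the standard library rather than the Airflow API) shows the dependency ordering such a DAG implies:

```python
# Sketch of DAG-style task ordering, mirroring an
# extract >> transform >> load >> data_quality_check pipeline.
from graphlib import TopologicalSorter

deps = {
    "transform": {"extract"},          # transform runs after extract
    "load": {"transform"},             # load runs after transform
    "data_quality_check": {"load"},    # validate after load
}
order = list(TopologicalSorter(deps).static_order())
```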
Security & Governance
Apply IAM role-based access, service accounts, VPC‑SC, and encryption policies.
Implement data quality checks and governance using Dataplex and validation frameworks.
Ensure compliance with GDPR and internal data security standards (encryption and decryption concepts, use of CMEK).
Operations & Monitoring
Set up monitoring dashboards and alerts using Cloud Monitoring & Logging.
Optimize resource usage to ensure cost‑efficient data processing.
Troubleshoot pipeline failures, performance bottlenecks, and data integrity issues.
Collaboration & Stakeholder Interaction
Partner with data analysts, ML teams, and product owners to define data needs.
Translate business requirements into scalable cloud architecture.
Document data flows, standards, lineage, and best practices for cross‑team adoption.
Ready to apply?
Join Ubique Systems and take your career to the next level!
Application takes less than 5 minutes

