Job Description

Role Description:
- Design, build, and maintain batch and streaming ETL pipelines using Python, PySpark, and orchestration tools (e.g., Airflow, AWS Step Functions, Glue workflows).
- Strong hands-on experience in Python, PySpark, SQL, AWS, and ETL to build and optimize scalable data pipelines and warehouse solutions.
- Work closely with Data Science, Analytics, and Business stakeholders to ensure reliable, high-quality data is available for reporting and advanced analytics.
- Develop optimized SQL for data modeling, transformations, and performance tuning across Data Warehouses/Lakes.
- Implement robust data ingestion frameworks from APIs, files, and RDBMS; manage schema evolution and partitioning strategies.
- Build and maintain data models (star/snowflake schemas, dimensional modeling) to support BI/Analytics and downstream ML workloads.
- Ensure data quality (validations, profiling, observability, lineage) and implement error handling and recovery patterns.
- Optimize PySpark jobs (shuffle management, partitioning, broadcast joins, caching) and SQL queries (explain plans, indexes, sort keys).
- Collaborate with stakeholders to translate requirements into technical designs; document pipelines, schemas, and runbooks.
- Maintain and automate Unix/Linux scripts for jobs, monitoring, and data operations.
- Uphold security and compliance (PII handling, encryption, role-based access, auditability).
- Unix/Linux scripting and operations are a plus.
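The dimensional-modeling responsibility above can be illustrated with a minimal star schema. This is a hypothetical sketch using SQLite (chosen only because it is self-contained, not because it is part of this role's AWS/PySpark stack); all table and column names are invented for illustration. One fact table joins to two dimension tables, supporting a typical BI aggregation:

```python
# Hypothetical star-schema sketch: one fact table, two dimensions.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, year INTEGER, month INTEGER);
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT, category TEXT);
CREATE TABLE fact_sales (
    date_key INTEGER REFERENCES dim_date(date_key),
    product_key INTEGER REFERENCES dim_product(product_key),
    amount REAL
);
INSERT INTO dim_date VALUES (20240101, 2024, 1), (20240201, 2024, 2);
INSERT INTO dim_product VALUES (1, 'Widget', 'Hardware'), (2, 'Gadget', 'Hardware');
INSERT INTO fact_sales VALUES (20240101, 1, 100.0), (20240101, 2, 50.0), (20240201, 1, 75.0);
""")

# Typical BI query: aggregate fact rows by attributes from the dimensions.
rows = cur.execute("""
    SELECT d.month, p.category, SUM(f.amount) AS total
    FROM fact_sales f
    JOIN dim_date d ON f.date_key = d.date_key
    JOIN dim_product p ON f.product_key = p.product_key
    GROUP BY d.month, p.category
    ORDER BY d.month
""").fetchall()
print(rows)  # [(1, 'Hardware', 150.0), (2, 'Hardware', 75.0)]
```

The design choice the star schema encodes: the fact table stays narrow (keys plus measures) while descriptive attributes live in the dimensions, which keeps BI aggregations a simple join-and-group pattern.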
Skills:
- Amazon Web Services (AWS) Cloud Computing
- Data Warehouse
- Python for Data Science
- PySpark
Join Astra-North Infoteck Inc. ("Conquering today's challenges, achieving tomorrow's vision!") and take your career to the next level!

