Key Responsibilities
- Diagnose and resolve data issues, production outages, and performance bottlenecks to ensure system reliability.
- Debug data job failures, including pipeline breakdowns and unexpected changes in row-level data.
- Optimize database queries and monitor database health to enhance efficiency and scalability.
- Develop automated workflows for data regeneration to maintain accuracy and consistency.
- Enable secure and efficient data access for analysis, balancing openness with compliance requirements.
- Stay updated on industry trends and recommend best-in-class data engineering technologies and practices.
- Collaborate with the Data Lead and engineering team to implement optimal data ingestion and storage strategies.
- Design and build scalable, robust, and maintainable data pipelines using cutting-edge technologies in consultation with the Data Lead.
Requirements
- At least 4 years of relevant work experience preferred.
- Strong proficiency in Python, with extensive hands-on experience using Pandas for data manipulation, transformation, and analysis of large datasets.
- Expertise in SQL performance tuning, database optimization, and query efficiency.
- Hands-on experience developing, deploying, and debugging robust data pipelines and ETL/ELT workflows in production environments.
- Experience with at least one cloud platform (AWS, GCP, or Azure) and data warehousing solutions.
- Strong problem-solving skills, particularly in navigating ambiguous or unknown data issues.
- Excellent communication skills with the ability to collaborate effectively with engineering and product teams.
Join HCLTech and take your career to the next level!

