JOB DESCRIPTION SUMMARY
In this role, you will design, develop, and maintain scalable data pipelines, ensure data flow across systems, and optimize data infrastructure. You will collaborate with cross-functional teams, assist with production support, and drive process optimizations to ensure a seamless data ecosystem for business operations. Strong Databricks experience is a must.
KEY DUTIES & RESPONSIBILITIES
- Data Pipeline Development: Design, build, and maintain robust data pipelines using Azure Data Factory (ADF), Azure Databricks, ADLS Gen 2 and PySpark, ensuring efficient data collection, processing, and storage from various sources.
- Databricks Expertise: Apply extensive hands-on Databricks experience, including resource provisioning and knowledge of the available tools and modules, and share that expertise to help optimize the platform.
- Data Integration: Integrate data from multiple structured and unstructured sources, including APIs, third-party systems, and databases (SQL, Oracle), ensuring smooth data flow across the ecosystem.
- Data Warehousing: Design, develop, and optimize data lakes and warehouses on platforms like Databricks, Azure Synapse and ADLS, supporting analytics and reporting needs.
- ETL/ELT Development: Develop and optimize ETL processes for efficient data extraction, transformation, and loading, using tools like ADF and custom pipelines.
- Process Optimization: Automate and streamline data workflows (incremental loads, email notifications, pipeline runs) to enhance system performance and reduce costs.
- Production Support & Resolution: Troubleshoot and resolve production failures within SLAs, ensuring minimal downtime and high data availability. Provide support during business hours and extended hours, and, for critical P1 failures, on weekends on a rotational basis, as required.
- Go-Live & Project Support: Support the successful migration and implementation of new code and solutions using Azure DevOps CI/CD.
- Team Management: Lead and manage vendor staff, ensuring smooth execution of projects or operations support as assigned, meeting SLAs and project milestones on time, and mentoring junior team members.
- Innovation & Optimization: Design scalable, high-performance data architectures and promote innovation, cost-efficiency, and operational excellence.
- Documentation & Knowledge Transfer: Create and maintain clear documentation for processes, operations, and technical solutions, including knowledge transfer documents and data catalogue updates.
- Monitoring & System Oversight: Monitor system performance, identify opportunities for optimization, and ensure platform scalability to meet future needs.
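Several of the duties above (incremental loads, pipeline runs) rest on the high-watermark pattern: persist the latest modification timestamp processed, and on each run pick up only records newer than it. As a rough, hypothetical illustration in plain Python — the record shape and field names are invented, and a real implementation would use PySpark/ADF against ADLS — the core logic looks like:

```python
from datetime import datetime

def incremental_load(records, last_watermark):
    """Return only records modified after the stored watermark,
    plus the new watermark to persist for the next run."""
    new_rows = [r for r in records if r["modified"] > last_watermark]
    # If nothing new arrived, keep the old watermark unchanged.
    new_watermark = max((r["modified"] for r in new_rows), default=last_watermark)
    return new_rows, new_watermark

# Hypothetical source table: id 1 was already loaded in a prior run.
source = [
    {"id": 1, "modified": datetime(2024, 1, 1)},
    {"id": 2, "modified": datetime(2024, 1, 5)},
]
rows, wm = incremental_load(source, datetime(2024, 1, 2))
# Only id 2 is newer than the stored watermark, and the new
# watermark advances to its modification time.
```

The same idea scales to an ADF pipeline, where the watermark is typically stored in a control table and compared in the source query.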
EDUCATION & SKILLS
- Bachelor's degree in Computer Science, Engineering, Information Systems, or a related field
- Proven experience working with cloud-based infrastructure, particularly Azure
- Proven experience working with Databricks
- Experience with containerization tools (Docker, Kubernetes) and CI/CD pipelines
- Proficient in English
- Preferred knowledge of machine learning workflows and model deployment
- Preferred proficiency in Arabic
EXPERIENCE & KNOWLEDGE
- 6+ years of experience in data engineering, production support, and pipeline management
- Cloud Platforms: ADLS Gen 2, Azure Data Factory, Azure Databricks, Azure Synapse Analytics, Azure DevOps, and Azure Key Vault
- Programming/Querying: Python, PySpark, SQL
- Automation & Monitoring: Custom automation tools, failure email triggers, performance monitoring solutions
- Preferred experience in project management
- Excellent leadership, communication, and problem-solving skills are a plus
ABILITIES & SPECIFIC REQUIREMENTS
- Handles diverse tasks and meets deadlines under pressure
- Works independently and responsibly
- Skilled negotiator with vendors
- Executes clear strategies aligned with corporate goals
- Communicates effectively with technical and non-technical audiences
- Leads professionally and fosters teamwork
- Manages change and adapts quickly with critical thinking
- Applies systematic problem-solving and agile decision-making
- Promotes collaboration, innovation, transparency, and accountability
Join Etihad Credit Bureau and take your career to the next level!