Position: Data Engineers x 5 (Dremio, Spark, Airflow) - 100% Remote - 6-Month Contract.
About The Role
We're expanding our telecom data platform team and are looking for talented Data Engineers to help design, build, and optimise large-scale data pipelines supporting analytics and automation across our telecom systems.
You'll play a key role in developing the next generation of our data integration and analytics framework, using technologies such as Apache Spark, Airflow, Kubernetes, and Dremio. This role is ideal for someone who enjoys working with distributed systems, complex data flows, and cutting-edge infrastructure.
What You'll Do
- Design, develop, and maintain scalable data pipelines and ETL processes.
- Integrate data from various telecom domains including CRM, billing, and network systems.
- Build efficient data transformation and orchestration workflows using Spark and Airflow.
- Manage deployments in Kubernetes on Linux-based environments (RHEL or equivalent).
- Optimise data accessibility and analytics performance through Dremio and related tools.
- Automate scheduling and monitoring to ensure stability and reliability.
- Troubleshoot data issues, conduct performance tuning, and maintain platform documentation.
Requirements
- Proven experience in data engineering and ETL workflow design.
- Strong working knowledge of Apache Spark, Airflow, and Kubernetes.
- Solid programming skills in Python, Java, and SQL.
- Hands-on experience with Red Hat Enterprise Linux (RHEL) or similar environments.
- Strong analytical, problem-solving, and communication skills.
- Experience working with telecom data (customer, billing, or network).
- Familiarity with data lake, mesh, or real-time streaming architectures.
- Exposure to cloud platforms such as AWS, Azure, or GCP.
- Understanding of CI/CD automation and data governance frameworks.
- Fluent English.
Ready to apply?
Join Belmont Lavan and take your career to the next level!
Application takes less than 5 minutes

