BGTS is a software and technology solutions company with over 1,800 professionals and 25+ years of experience. Through engineering expertise and industry insight, our international offices deliver tailored solutions, enabling clients worldwide to achieve their business goals with flexibility, speed, and impactful results.
Location: Remote
Employment Type: Long Term
About the Role
We are looking for a skilled and passionate Data Engineer to join and strengthen the Data Engineering team of a leading UK-based client. In this role, you will play a critical part in building and maintaining modern, scalable, and high-performing data platforms using Azure Databricks and Lakehouse architecture.
You will work with cutting-edge technologies and collaborate closely with data science, analytics, and platform teams to unlock the true value of data.
Key Responsibilities
As a Data Engineer, you will be responsible for:
Data Engineering & Development
- Design, build, and maintain high-quality, scalable, and tested data pipelines.
- Develop and manage Databricks structured streaming pipelines.
- Build and optimize event-driven and real-time data processing solutions.
- Implement and maintain Unity Catalog-based Lakehouse architecture.
- Develop analytics-ready datasets to support business insights and reporting.
Platform & Automation
- Build and manage CI/CD pipelines using Azure DevOps.
- Identify and implement automation opportunities across workflows.
- Ensure reliable and stable data platform operations.
- Apply governance, security, and documentation standards.
Data Quality & Reliability
- Establish the Data Lakehouse as a trusted and reliable source of truth.
- Monitor, troubleshoot, and resolve data incidents.
- Support business users and technical teams with data-related queries.
- Continuously improve platform performance and reliability.
Collaboration & Support
- Work closely with data science, analytics, platform, and business teams.
- Champion data engineering best practices.
- Provide technical guidance and mentorship where required.
- Contribute to a culture of learning, quality, and continuous improvement.
Core Behaviours & Values
We are looking for someone who demonstrates:
✔ Think Impact
- Focuses on outcomes and business value.
- Prioritizes effectively and avoids low-impact activities.
✔ Own It
- Takes responsibility for delivery.
- Proactively removes blockers and manages stakeholders.
✔ Back Each Other
- Works collaboratively and supports colleagues.
- Encourages open and honest communication.
✔ Push for Better
- Continuously seeks improvement.
- Challenges the status quo and embraces innovation.
Technical Skills & Experience
Essential Skills
- Strong experience with Azure Databricks and cloud data platforms.
- Advanced proficiency in Python, PySpark, and SQL.
- Experience developing Spark/Databricks pipelines.
- Hands-on experience with structured streaming and event-driven systems.
- Strong understanding of Lakehouse architecture and best practices.
- Experience with Unity Catalog.
- Expertise in Azure DevOps and CI/CD pipelines.
- Knowledge of data modelling (dimensional/star schemas).
- Experience working in Agile environments.
Desirable Skills
- Exposure to multiple data technology stacks.
- Experience in large-scale enterprise environments.
- Knowledge of security, governance, and compliance frameworks.
Collaboration & Communication
You will be expected to:
- Partner with cross-functional teams to deliver integrated data solutions.
- Build strong relationships and become a trusted team member.
- Communicate clearly and professionally.
- Balance fast delivery with long-term scalability and maintainability.
- Stay updated with industry trends and emerging technologies.
Technical Assurance & Quality
- Ensure all solutions are fit for purpose and aligned with business strategy.
- Apply best practices in architecture, performance, and security.
- Continuously optimize platform efficiency.
- Maintain high engineering standards without being overly rigid.
Conduct & Compliance
In line with regulatory requirements, you must:
- Act with integrity, diligence, and professionalism.
- Treat customers fairly and ethically.
- Cooperate with regulatory bodies.
- Follow proper market conduct standards.
- Deliver positive outcomes for end users.
Education & Qualifications
- Bachelor’s degree in Computer Science, Engineering, or a related field (preferred but not mandatory).
- Relevant certifications in cloud/data technologies are a plus.
Why Join
- Work with cutting-edge Databricks and Lakehouse technologies.
- Be part of a high-impact data transformation journey.
- Collaborate with talented engineers and analysts.
- Opportunity for continuous learning and career growth.
- Contribute to building a future-ready data ecosystem.
Job Details
- Posted: Feb 10, 2026
- Type: Full-time
- Level: Mid-Senior
- Location: Türkiye
- Company: BGTS