The role centres on DevOps, Azure Databricks, and modern Infrastructure-as-Code (IaC) practices using Bicep or Terraform.
The candidate will collaborate closely with our client's Data Engineers, Security team, and Platform teams to ensure smooth development workflows, automated deployments, and securely governed cloud environments, all in line with the client's quality standards.
Key Responsibilities
CI/CD Pipeline Engineering (Azure DevOps)
- Design, develop, and maintain Azure DevOps pipelines for data processing workflows, ML model training, and Databricks deployments.
- Implement pipeline quality gates, automated testing, environment promotion strategies, and artifact management.
- Ensure pipelines are resilient, observable, and aligned with organizational standards.
- Ensure the Software Development Lifecycle is built according to the client's policies and meets their requirements.
- Build, maintain, and standardize cloud infrastructure using Bicep or Terraform for Azure resources such as Databricks workspaces, Storage Accounts, Key Vaults, and networks, including containerized workloads where applicable.
- Ensure infrastructure is modular, reusable, and compliant with enterprise security and governance requirements.
- Develop automation for recurring operational tasks (orchestration, monitoring, environment provisioning) using PowerShell and Python.
- Create scripts supporting Databricks job deployments, cluster lifecycle management, ML model registration, and data workflow automation.
- Build and maintain deployment mechanisms for Databricks notebooks/jobs, workflows, ML models, and Delta pipelines.
- Support ML lifecycle automation, including data validation, model packaging, model registry updates, and automated retraining pipelines.
- Collaborate with Data Science teams to operationalize machine learning workflows in production environments.
- Ensure high availability, scalability, and reliability of data and ML workloads in Azure.
- Implement monitoring & alerting for pipelines, clusters, and data workflows.
- Contribute to operational improvements and proactive issue prevention.
- Work closely with cross-functional teams and participate in review processes for IaC, pipeline changes, and data platform enhancements.
- Ensure changes follow standardized processes as outlined in the client's Quality Handbook.
- Document designs, processes, and architecture diagrams to support transparency and long-term maintainability.
Technical Skills
- Azure DevOps pipelines (YAML), environments, approvals, artifacts
- Infrastructure as Code: Bicep and/or Terraform and/or Pulumi
- Scripting languages: PowerShell and Python
- Strong understanding of Azure Databricks, Spark fundamentals, and Databricks deployment patterns
- Experience with Azure Core Services: Key Vault, Storage, VNet, AAD, Monitor, AKS (optional)
- Familiarity with containerization, Git branching strategies, and DevOps best practices
- Experience with MLOps frameworks (MLflow, Databricks Model Registry)
- Experience deploying large-scale data or ML workloads
- Knowledge of cloud security best practices and networking in Azure
- Strong communication and ability to collaborate across data, engineering, and infrastructure teams
- Analytical mindset with a passion for automation and continuous improvement
- Ability to troubleshoot complex distributed systems and data pipelines
- Rate offered: £450-475 per day
- IR35 Status: Outside
- Location: Remote
- Start date: March '26
- Duration: initial 3-month engagement, with significant opportunity for extension.