Azure Data Engineer - Python/PySpark, SQL, Azure Data Factory, Azure Databricks, Azure SQL Database, Azure Data Lake Storage, Azure Key Vault, ETL/ELT
Locations: Stockholm or Malmö, with Helsingborg as an optional base
Work Model: Hybrid, 2–3 days per week onsite
Start: ASAP
Contract: 12 months (freelance)
Our client, a highly regarded technology company shaping modern data-driven products, is looking for an experienced Azure Data Engineer to join their team. This role centres on building robust, scalable, cloud-native data solutions that empower analytics, business decisions, and long-term digital strategy.
Key Responsibilities
Design and build data solutions
Create and maintain scalable data pipelines and cloud architectures using a wide range of Azure services.
Develop ETL/ELT processes
Design, manage, and optimise ETL and ELT workflows, particularly using Azure Data Factory and related tools.
Manage data storage
Implement and oversee data storage platforms such as Azure SQL Database, Azure Data Lake Storage (ADLS), and Azure Cosmos DB.
Ensure data quality
Apply validation, cleansing, and quality assurance methods to maintain strong data integrity across pipelines and models.
Optimise performance
Monitor, diagnose, and resolve performance issues across data pipelines, transformations, and databases.
Collaborate with stakeholders
Work alongside data scientists, analysts, engineering teams, and business stakeholders to understand data needs and deliver clear, scalable solutions.
Ensure data security
Implement access controls and security measures within the Azure ecosystem, leveraging services such as Azure Key Vault.
Required Skills & Experience
Azure expertise
Hands-on proficiency with Azure Data Factory, Azure Databricks, Azure SQL Database, Azure Data Lake Storage, Azure Key Vault, and related cloud components.
Programming & scripting
Strong experience with Python, SQL, and PySpark for data transformation, automation, and analysis.
Data modelling
Ability to design, maintain, and optimise data models that ensure consistency, scalability, and performance.
Cloud & data warehousing concepts
Solid understanding of cloud data architectures, data warehousing principles, and modern data engineering patterns.
Analytical ability
Comfort working with large datasets, investigating complex data behaviour, and supporting analytical teams.
Ready to apply?
Join Empiric and take your career to the next level!
Application takes less than 5 minutes

