- Initial 9 Month Contract | Potential For Extensions
- Clayton Location | 3 Days On-Site & 2 Days WFH
- Data Transformation Program | Databricks Lakehouse (Delta Lake / Spark)
The Role: The Senior Data Analyst is responsible for safeguarding system-agnostic business information models by modelling and stitching together data within the existing enterprise data ecosystem, ensuring consistency and continuity.
The Responsibilities:
- Develop and maintain advanced data transformations and analytical workflows using SQL, Python, and Databricks, operating on large-scale datasets within a Lakehouse (Delta Lake / Spark) architecture.
- Design and document business-aligned data models using best practice modelling principles.
- Contribute to the definition and implementation of internal data modelling standards across the team.
- Design and build well-structured, efficient reports and dashboards in Power BI that are tailored to business needs and based on trusted, modelled datasets.
- Investigate complex data problems and independently prototype and implement analytical solutions within Databricks.
- Conduct data profiling, validation, and quality remediation to ensure accuracy, completeness, and consistency.
Skills & Experience Required:
- Minimum 7 years of experience as a Data Analyst delivering high-quality analytics using SQL and Python within Databricks.
- Strong understanding of Lakehouse architecture, particularly working with large-scale structured datasets.
- Demonstrated ability to build and maintain robust data models and curated datasets to support reporting and analysis.
- Experience working with semantic modelling tools (e.g., Lucidchart or equivalent) to visualise data models.
- Extensive hands-on experience designing, building, and maintaining Power BI dashboards aligned with best practices.
What's in it for you:
- Initial 9 Month Contract | Potential For Extensions
- Clayton Location | 3 Days On-Site & 2 Days WFH
- Data Transformation Program | Databricks Lakehouse (Delta Lake / Spark)
Apply today and Peter Li will reach out to share further information.