Your Role
- Analyse large and complex datasets to identify trends, patterns, and insights that support business decision-making.
- Develop and optimize SQL queries, data models, and analytical datasets for reporting and advanced analysis.
- Design, build, and maintain BI dashboards and reports using enterprise BI tools.
- Work with cloud platforms (Azure, AWS, or GCP) to access, process, and analyse data.
- Perform data preparation, transformation, and validation using Python, PySpark, and Databricks.
- Collaborate with data engineering teams on data pipelines, data quality checks, and performance optimization.
- Apply data modelling principles to ensure scalable and reusable analytical datasets.
- Support AI-driven analytics initiatives, including exploratory analysis and feature preparation.
- Work in Agile delivery environments, contributing to sprint planning, backlog refinement, and demos.
- Maintain documentation and governance artifacts using JIRA, Confluence, and DevOps tools.
Your Profile
- Strong proficiency in SQL for data analysis and data validation.
- Hands-on experience with Python for data analysis, transformation, and automation.
- Experience with BI tools (e.g., Power BI, Tableau, Looker, or similar).
- Practical experience with cloud platforms: Azure, AWS, or GCP (at least one required).
- Working knowledge of PySpark and Databricks for large-scale data processing.
- Solid understanding of data modelling concepts (dimensional modelling, fact/dimension tables).
- Good balance of Data Analytics, Data Engineering, Data Modelling, and Business Intelligence skills.
- Familiarity with Agile methodologies and tools such as JIRA and Confluence.
- Experience working with DevOps practices and CI/CD pipelines for data solutions.
What We Offer
- We recognize the importance of flexible work arrangements. Whether it's remote work or flexible hours, you will have an environment that supports a healthy work-life balance.
- Your career growth is at the heart of our mission. Our career development programs and diverse roles are designed to help you explore a world of opportunities.
- Equip yourself with valuable certifications in the latest technologies, such as Generative AI.
Join Capgemini and take your career to the next level!

