DATA ENGINEER
The Data Engineer plays a key role in organization-wide data transformation projects. You will develop and automate data processing pipelines for data modeling, analysis, and reporting from various data sources. The primary focus of this role is helping to set up the Delta Lake architecture and delivering data-driven solutions.
Job Description
- Build data flows for data acquisition, aggregation, and modeling, using both batch and streaming paradigms.
- Develop high-quality code for the core data stack, including the data integration hub, data warehouse, and data pipelines under the Azure environment.
- Collaborate with other developers as part of a SCRUM team to ensure overall team productivity.
- Perform data analysis, data profiling, data cleansing, data lineage, data mapping, and data transformation.
- Assist in designing, developing, documenting, and implementing end-to-end data pipelines and data-driven solutions.
- Provide technical support for data-related issues with recommendations and solutions.
- Critically analyze the organization's information needs.
- Help define KPIs, set up monitoring, and implement alerting at the data level.
Profile
- You hold a Bachelor’s or Master’s degree.
- You have at least 3 years of experience in a similar role.
- You are strong in detailed analysis but can also translate your findings into clear, practical synthesis and implementation.
- Knowledge of and/or experience with the following:
  - Must have: Microsoft Azure Data Platform (Azure Delta Lake, Databricks, Azure Data Factory, Event Hub, Debezium)
  - Must have: Expertise in building ETL and data pipelines on Databricks using data engineering languages such as Python and SQL, as well as on-premise Microsoft SQL Server (including Transact-SQL, stored procedures, Analysis Services, indexing, etc.)
  - Must have: Azure DevOps, including CI/CD implementation and automation of Azure Data Factory deployments
  - BI concepts and implementation, preferably star schema modeling
- You have a good understanding of the Microsoft Fabric suite and can critically compare this technology with alternatives such as Databricks within a modern data architecture.
- You can work independently.
- You are collegial and contribute to team thinking.
- You are capable of taking a critical perspective and discussing substantive topics (related to management information) at all levels within the organization.
- You handle deadlines and priorities well and work in an Agile way.
- You are fluent in French.
Ready to apply?
Join Koda Staff and take your career to the next level!
Application takes less than 5 minutes