Responsibilities
- Development and maintenance of Enterprise Data Warehouses (EDW) and complex Business Intelligence solutions (data lakes / data lakehouses).
- Design and development of data pipelines for scalable and reliable data workflows to transform extensive quantities of both structured and unstructured data.
- Data integration from various sources, including relational databases, APIs, data streaming services and cloud data platforms.
- Optimisation of queries and workflows for increased performance and enhanced efficiency.
- Writing modular, testable and production-grade code.
- Ensuring data quality through monitoring, validation and data quality checks, maintaining accuracy and consistency across the data platform.
- Development of test programs.
- Documenting processes comprehensively to ensure seamless data pipeline management and troubleshooting.
- Assistance with deployment and configuration of the system.
- Participation in meetings with other project teams.
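The responsibilities above centre on building pipelines whose data quality is enforced through validation checks. As a hedged illustration only (all names and data are invented, not Thaleria's actual stack), a minimal extract–transform–validate step in Python might look like:

```python
from dataclasses import dataclass

@dataclass
class Record:
    id: int
    amount: float

def extract() -> list[Record]:
    # Stand-in for reading from a real source (database, API, stream).
    return [Record(1, 10.0), Record(2, -5.0), Record(3, 7.5)]

def transform(records: list[Record]) -> list[Record]:
    # Example transformation: keep only non-negative amounts.
    return [r for r in records if r.amount >= 0]

def validate(records: list[Record]) -> None:
    # Data-quality checks: unique ids, no negative amounts.
    ids = [r.id for r in records]
    assert len(ids) == len(set(ids)), "duplicate ids"
    assert all(r.amount >= 0 for r in records), "negative amount"

def run_pipeline() -> list[Record]:
    records = transform(extract())
    validate(records)
    return records
```

Keeping transform and validate as separate, testable functions is one way to meet the "modular, testable and production-grade code" expectation above.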
Requirements
- B2 level of English minimum; any level of French is an asset.
- Excellent knowledge of relational database systems as applied to data warehousing, including data warehouse design and architecture.
- Excellent knowledge of code-based data transformation tools such as dbt (data build tool) and Spark.
- Excellent knowledge of SQL.
- Excellent knowledge of data integration and ETL/ELT tools.
- Good hands-on experience as a Data Engineer on a modern data platform, and with data analytics techniques and tools.
- Good knowledge of programming in Python.
- Good knowledge of orchestration tools such as Airflow or Dagster.
- Good knowledge of data modelling tools.
- Good knowledge of online analytical data processing (OLAP) and data mining tools.
- Experience with data platforms such as Fabric, Talend, Databricks and Snowflake.
- Experience with containerised application development and deployment tools, such as Docker, Podman and Kubernetes.
- Ability to participate in multilingual meetings.
- Experience working in a team; team spirit.
- Ability to work with a high degree of rigour and method, and specifically to follow naming conventions and coding standards.
- Capacity in preparing and writing clear and structured technical and user documents.
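The requirements above mention orchestration tools such as Airflow and Dagster, whose core idea is running tasks in dependency order. A minimal, library-free sketch of that concept using Python's standard library (task names are invented; this is not either tool's API):

```python
from graphlib import TopologicalSorter

def run_dag(tasks: dict[str, set[str]], actions: dict) -> list[str]:
    """Run each task after its upstream dependencies; return the order."""
    order = list(TopologicalSorter(tasks).static_order())
    for name in order:
        actions[name]()
    return order

executed = []  # records which tasks ran, and in what order

# Each task maps to the set of tasks that must run before it.
tasks = {
    "extract": set(),
    "transform": {"extract"},
    "load": {"transform"},
}
actions = {n: (lambda n=n: executed.append(n)) for n in tasks}
```

Real orchestrators add scheduling, retries and monitoring on top, but the dependency-ordered execution shown here is the shared foundation.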
Join Thaleria and take your career to the next level!