Design and optimize cutting-edge data pipelines and warehouse solutions using Snowflake and DBT in a fully remote role across Québec or Ontario. This permanent opportunity offers a salary of $82–90K (negotiable based on experience) and the chance to work in a dynamic, cloud-based environment with strategic impact.
What is in it for you:
• Salary starting at $82,000 (negotiable based on experience).
• Annual bonus based on individual performance and company profitability, paid at the end of the fall.
• Permanent full-time position (40 hours/week), Monday to Friday, between 8 am and 5 pm.
• 3 weeks of vacation per year, depending on seniority.
• Comprehensive benefits package available after 90 days: dental and medical insurance, massage therapy, chiropractic care, and more.
• Retirement savings plan: voluntary contribution of up to 3% of salary, with matching employer contribution.
Responsibilities:
• Design, build, and maintain data pipelines, warehouses, and data models using Snowflake and DBT.
• Collaborate with cross-functional teams to gather data requirements and develop efficient data architectures.
• Implement and manage ETL/ELT processes across structured and unstructured data sources using tools such as Azure Data Factory and SQL.
• Enforce data governance protocols including quality, lineage, metadata management, and security compliance.
• Monitor system performance, conduct tuning, and proactively address bottlenecks.
• Maintain documentation of data processes, architecture, and technical specifications.
• Contribute to team knowledge by supporting peers and staying current on data engineering trends.
What you will need to succeed:
• Bachelor's or graduate degree in computer engineering, data science, mathematics, or a related discipline.
• Relevant certifications in Azure Data Services or Snowflake are considered an asset.
• 4–6 years of experience in data engineering or a related field.
• Proficient in SQL and familiar with both relational and NoSQL databases (e.g., MS SQL Server, Snowflake, PostgreSQL, Cosmos DB).
• Hands-on experience with Snowflake and DBT for warehousing and data transformation.
• Skilled in designing and optimizing data pipelines and ETL/ELT workflows.
• Experience with cloud platforms, particularly Azure, and cloud-based storage systems.
• Familiarity with data pipeline and orchestration tools such as Azure Data Factory, Airflow, Azkaban, or Luigi.
• Experience leveraging REST APIs for data integration.
• Comfortable working in multidisciplinary teams to address complex data processing challenges.
• Proficiency in English and French to support data governance, documentation, and collaboration across teams in both languages.
Why Recruit Action?
Recruit Action (agency permit: AP-2504511) provides recruitment services through quality support and a personalized approach to job seekers and businesses. Only candidates who match hiring criteria will be contacted.
# GE220725
Ready to apply?
Join Recruit Action inc. and take your career to the next level!
Application takes less than 5 minutes.