Role: Snowflake Developer
Location: Poland
100% Remote
B2B Contract
Primary Skillset:
• Data Modeling & ETL: Design, develop, and optimize data models and ETL processes using Snowflake for efficient data storage and analytics.
• Design and implement end-to-end ETL pipelines for loading data from various sources into Snowflake.
• Utilize Snowflake’s built-in features such as tasks, streams, and Snowpipe to automate the ETL process for continuous and batch data loads (a sketch follows this list).
• Implement data transformation logic using SQL, Snowflake Stored Procedures (SQL and Python), and ETL tools to ensure the integrity, accuracy, and consistency of data.
• Optimize data loads and transformations for scalability and performance using Snowflake’s micro-partitioning and clustering features.
• Optimize and tune Snowflake queries, data storage, and warehouses for performance and efficiency.
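As a rough illustration of the stream/task/Snowpipe pattern and the clustering point above, here is a minimal Snowflake SQL sketch. All object names (raw_orders_stage, RAW_ORDERS, ORDERS, ETL_WH) are hypothetical and not taken from this posting, and a real setup would also need a storage integration and credentials for the S3 stage.

```sql
-- Minimal sketch of continuous ingestion + automated transformation.
-- All names are hypothetical; the curated table ORDERS is assumed to exist.

-- External stage over an S3 bucket (storage integration / credentials omitted).
CREATE OR REPLACE STAGE raw_orders_stage
  URL = 's3://example-bucket/orders/'
  FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);

-- Landing table for raw files.
CREATE OR REPLACE TABLE RAW_ORDERS (
  order_id    NUMBER,
  customer_id NUMBER,
  amount      NUMBER(12,2),
  order_ts    TIMESTAMP_NTZ
);

-- Snowpipe: auto-ingests new files from the stage as they arrive.
CREATE OR REPLACE PIPE raw_orders_pipe
  AUTO_INGEST = TRUE
AS
  COPY INTO RAW_ORDERS FROM @raw_orders_stage;

-- Stream: records rows added to the landing table since the last consumption.
CREATE OR REPLACE STREAM raw_orders_stream ON TABLE RAW_ORDERS;

-- Task: runs every 5 minutes, but only when the stream has data,
-- and merges the new rows into the curated table.
CREATE OR REPLACE TASK merge_orders_task
  WAREHOUSE = ETL_WH
  SCHEDULE  = '5 MINUTE'
  WHEN SYSTEM$STREAM_HAS_DATA('raw_orders_stream')
AS
  MERGE INTO ORDERS AS tgt
  USING raw_orders_stream AS src
    ON tgt.order_id = src.order_id
  WHEN MATCHED THEN UPDATE SET amount = src.amount, order_ts = src.order_ts
  WHEN NOT MATCHED THEN INSERT (order_id, customer_id, amount, order_ts)
    VALUES (src.order_id, src.customer_id, src.amount, src.order_ts);

ALTER TASK merge_orders_task RESUME;

-- Clustering key on the curated table to improve pruning for time-range queries.
ALTER TABLE ORDERS CLUSTER BY (order_ts);
```

The WHEN clause keeps the task from consuming warehouse credits on runs where no new data has landed, which is the usual reason to pair streams with scheduled tasks.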
DevOps Integration - Secondary
• Azure DevOps Experience: Proficiency with Azure DevOps for:
  • Deploying Snowflake scripts across different environments (a deployment sketch follows this list)
  • Managing ETL pipeline deployments
• Proficiency in DevOps tools and best practices, with experience deploying Snowflake and ETL services.
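To make the deployment point concrete, below is a hypothetical, environment-parameterized SQL script of the kind an Azure DevOps stage might execute with SnowSQL. The env variable, database naming convention, and objects are assumptions for illustration, not details from the posting.

```sql
-- deploy/010_core_orders.sql : hypothetical deployment script.
-- An Azure DevOps stage could run it once per environment, e.g.:
--   snowsql -f deploy/010_core_orders.sql -D env=DEV -o variable_substitution=true

USE DATABASE ANALYTICS_&{env};   -- e.g. ANALYTICS_DEV, ANALYTICS_TEST, ANALYTICS_PROD
USE SCHEMA CORE;

-- Idempotent DDL so the identical script can be promoted unchanged across environments.
CREATE TABLE IF NOT EXISTS ORDERS (
  order_id    NUMBER,
  customer_id NUMBER,
  amount      NUMBER(12,2),
  order_ts    TIMESTAMP_NTZ
);

CREATE OR REPLACE VIEW ORDERS_DAILY AS
SELECT DATE_TRUNC('day', order_ts) AS order_date,
       COUNT(*)                    AS order_count,
       SUM(amount)                 AS total_amount
FROM ORDERS
GROUP BY 1;
```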
Qualifications:
• Proven experience as a Snowflake Developer, with hands-on work on Snowflake data warehousing solutions.
• Data Architecture: Strong understanding of Snowflake platform features, including micro-partitioning, file processing from AWS S3, and data quality practices.
• Expertise in writing and optimizing SQL queries, including complex queries, CTEs, and stored procedures (using JavaScript within Snowflake); an example query follows this list.
• Solid experience working with Snowflake’s micro-partitioning, file processing from AWS S3, and optimizing data models.
• Strong knowledge of Python for data engineering tasks and automation.
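As an illustration of the SQL expectation above, the query below is a hedged example of the kind of CTE- and window-function-based query involved; the ORDERS table and its columns are hypothetical.

```sql
-- Hypothetical example: latest order per customer joined to monthly spend.
WITH ranked_orders AS (
    SELECT
        customer_id,
        order_id,
        amount,
        order_ts,
        ROW_NUMBER() OVER (PARTITION BY customer_id ORDER BY order_ts DESC) AS rn
    FROM ORDERS
),
monthly_totals AS (
    SELECT
        customer_id,
        DATE_TRUNC('month', order_ts) AS order_month,
        SUM(amount)                   AS month_amount
    FROM ORDERS
    GROUP BY 1, 2
)
SELECT r.customer_id,
       r.order_id  AS latest_order_id,
       r.amount    AS latest_amount,
       m.order_month,
       m.month_amount
FROM ranked_orders AS r
JOIN monthly_totals AS m
  ON m.customer_id = r.customer_id
WHERE r.rn = 1
ORDER BY r.customer_id, m.order_month;
```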
Ready to apply? Join Ampstek and take your career to the next level!

