The vacancy is open for remote or office-based work within Latvia.
Salary range: 4200-4700 EUR gross under an employment contract in Latvia.
Client
Our client is the world leader in online gaming services. The business is developing in several directions, the most prominent of which are a betting exchange and online games. The betting center and the bookmaker run on the company's own software. The company's web services have mobile versions and native iOS and Android applications, and are used by more than two million players across 50 countries.
Position overview
We are seeking an expert in designing, implementing, and operating stable, scalable, and cost-effective ETL solutions that extract data from production systems into our data platform. We want someone passionate about working with vast datasets who loves bringing data together to answer business questions and drive growth.
Responsibilities
- Lead data and process innovation within our Data Platform team through the robust delivery of exceptional solutions.
- Transform raw data into useful, actionable data systems.
- Strive for efficiency by aligning data systems with business goals.
- Contribute to technology selection and ensure proper data design and implementation across projects.
- Demonstrate hands-on expertise in processing large datasets, data structures, access patterns, big data concepts, and cloud computing.
- Collaborate with Data Product Owners to understand data requirements and build ETL processes.
Requirements
- 4+ years of experience designing and implementing efficient ETL processes and data pipelines.
- Experience with streaming-based integration patterns and stream processing.
- Strong experience with Databricks platform on AWS (including workspaces, clusters, jobs, notebooks).
- Proficiency in PySpark, SQL, and Python for data processing.
- Hands-on experience with Delta Lake, Lakehouse architecture, and medallion design patterns (bronze/silver/gold); see the sketch after this list.
- Broad familiarity with AWS technologies, especially Glue, Athena, EMR, Redshift, Lambda, and Kinesis.
- Experience with CI/CD platforms and deployment practices such as Jenkins, Ansible, shell scripting, and unit/integration testing.
- Experience designing scalable data warehousing solutions.
- Familiarity with Agile software development and DevOps practices.
- Strong collaboration skills and ability to communicate effectively with both technical and business teams.
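
To make the medallion requirement above concrete, here is a minimal PySpark sketch of a bronze/silver/gold flow with Delta Lake. It assumes a Spark session with Delta Lake available (as on a Databricks runtime); all paths, table names, column names, and the event schema are hypothetical, not taken from the posting.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Assumes Delta Lake is configured on the cluster (e.g. a Databricks runtime).
spark = SparkSession.builder.appName("medallion-sketch").getOrCreate()

# Bronze: land raw production events as-is; source path and schema are hypothetical.
raw = spark.read.json("s3://example-bucket/raw/events/")
raw.write.format("delta").mode("append").save("s3://example-bucket/bronze/events")

# Silver: deduplicate and type-clean the bronze data.
bronze = spark.read.format("delta").load("s3://example-bucket/bronze/events")
silver = (
    bronze
    .dropDuplicates(["event_id"])                         # remove replayed events
    .withColumn("event_ts", F.to_timestamp("event_ts"))   # normalize timestamps
    .filter(F.col("player_id").isNotNull())               # drop unusable rows
)
silver.write.format("delta").mode("overwrite").save("s3://example-bucket/silver/events")

# Gold: aggregate into a business-facing table (e.g. activity per player).
gold = silver.groupBy("player_id").agg(F.count("*").alias("event_count"))
gold.write.format("delta").mode("overwrite").save("s3://example-bucket/gold/player_activity")
```

In practice, each layer would typically run as a scheduled Databricks job with incremental loads (streaming or merge-based) rather than full overwrites; this sketch only illustrates the layering itself.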
Ready to apply?
Join DataArt and take your career to the next level!
Applying takes less than 5 minutes.

