Requirements
- Solid hands-on experience with Databricks, including data engineering and analytics
- Strong proficiency in SQL and Python/PySpark
- Experience with cloud platforms (AWS, Azure, or GCP)
- Previous experience on similar projects (e.g., Abbot or Bayer) is a plus
- English level B2+ (Upper-Intermediate or higher)
Responsibilities
- Design, develop, and maintain data pipelines and workflows in Databricks
- Work with large datasets, performing ETL and data transformation tasks using SQL and Python/PySpark
- Collaborate with the team to implement cloud-based data solutions and optimize performance
- Support data analytics and reporting requirements as needed
We Offer
- 100% remote work from Europe (CET)
- Monthly salary in USD
- 15 working days of paid vacation per year (available after 6 months)
- Up to 10 national holidays (based on the project team location or the client's country)
- 5 paid sick leave days
- Health insurance reimbursement (up to $1,000 gross per year)
Ready to apply?
Join inbybob_ and take your career to the next level!
Application takes less than 5 minutes