Your Key Responsibilities
- Design, develop, and maintain secure, scalable, and reliable cloud infrastructure and architectures for custom and ready-made applications, ensuring operability and aligning with architectural best practices and standards while focusing on user needs.
- Establish and implement monitoring systems with alert mechanisms to proactively detect and resolve performance-related issues.
- Build and maintain data pipelines and data orchestration infrastructure to support data-driven use cases.
- Integrate various data sources into the data warehouse or lakehouse, ensuring data quality and consistency. Where applicable, leverage Dremio for data lake query acceleration and semantic layer implementation.
- Assume technical responsibility for applications and infrastructure throughout their lifecycle, collaborating with DevOps Engineers.
- Enhance the automation of operational processes, striving for self-service capabilities.
- Maintain and extend existing internal monitoring reports using Power BI, ensuring data visualization is effective and actionable.
- Act as a contact person, advisor, and idea provider for the data teams on data technology and operational procedures.
- Assist customers with troubleshooting, documentation, training, and usability efforts, while managing customer tickets.
- Formulate and maintain disaster recovery strategies and business continuity plans for critical applications and infrastructure.
Your Profile
- University degree in Computer Science, Information Technology, Data Engineering, or a related field, combined with relevant professional experience.
- At least five years of relevant professional experience in data engineering, platform development, and IT infrastructure projects.
- Proven experience with automation and infrastructure-as-code principles, specifically using CloudFormation or Terraform, as well as CI/CD and containerization technologies such as Jenkins, GitHub Actions, and Docker.
- Demonstrated competence in managing cloud environments and expertise in data-related cloud services within AWS (S3, Redshift, IAM-based security models, Glue, AWS Lake Formation) and Azure (Blob Storage, Delta Lake, ADF, Synapse), focusing on storage solutions and databases.
- Experience with data warehouse and data lakehouse architectures and technologies; knowledge of and experience with Dremio is a plus.
- Proficient programming skills in Python, with experience in frameworks like FastAPI; familiarity with JavaScript (NodeJS, TypeScript), Go, or other languages such as Java or C# is advantageous.
- Proficiency in Power BI for creating and maintaining internal monitoring reports and understanding of data visualization tools.
- Fluency in spoken and written English; German is a plus.
- Enthusiastic about fostering direct interaction with users and collaborating closely with Data Analysts, Data Scientists, and Data Engineers.
- Strong analytical skills complemented by a proactive attitude and ability to monitor trends in data engineering.
- Understanding of analytical technologies and adeptness in ETL processes and tools, with interest in applying trends to data platforms.
- Ability to write well-formatted, structured, and clean code, comfortable with backend and DevOps technologies.
- Enthusiasm for debugging and command-line operations, paired with a passion for automation using Bash, Python, or JavaScript.
- Experience in agile methodologies, promoting an agile mindset throughout work activities.
Your Benefits
- Onboarding & Integration: we help you settle in quickly through our onboarding program and a team that supports you;
- Health & Life Insurance: we provide private health plans and life insurance, so you can have peace of mind;
- Discounts: enjoy discounts from our partners on activities, trainings, goods, and more;
- Bookster Subscription: are you a passionate reader? Then you can borrow your favorite books from the Bookster Library;
- Professional Development: with our variety of projects and teams, you'll be able to learn new skills, develop and shape the career path that fits you best;
- Bonuses: performance bonuses reward your achievements, and through referral bonuses we thank you for helping our team grow;
- Gifts for Special Occasions: we celebrate Easter, 8th of March, and the winter holidays;
- Vacation Days: we truly believe in long-term partnerships, and we're committed to supporting your well-being with a generous vacation policy from the start of your journey with us;
- Meal Tickets: we're happy to offer meal tickets as part of your everyday benefits, helping make your lunch breaks more enjoyable;
- Flexibility: you can balance your work and personal life with arrangements tailored to your job and responsibilities.
Exciting assignments and outstanding development opportunities await you because we impact the future with innovation. We look forward to your application.

