DataOps Engineer
An international group active in the services sector is looking for a DataOps Engineer.
Key Responsibilities
- Collaborate with the Data Engineering, DevOps, and Architecture teams to design, deploy, and operate scalable, reliable data infrastructure supporting data ingestion, analytics, and AI projects;
- Build, automate and manage data platform environments (data lakes, data warehouses, streaming systems) leveraging AWS services and Infrastructure as Code practices;
- Implement and maintain CI/CD pipelines for data workflows, ensuring high availability, observability, and security across all environments;
- Develop monitoring, logging, and alerting systems to ensure performance, reliability, and cost optimization of data workloads;
- Contribute to the evolution of a data-centric culture by enabling fast, safe, and repeatable deployment of data solutions;
- Work within an Agile team with a collaborative mindset, contributing to continuous improvement of processes, automation, and platform reliability.
Required Skills
- Strong experience with AWS services (e.g. S3, Glue, ECS, EKS, Lambda, CloudFormation, IAM, CloudWatch);
- Solid understanding of CI/CD pipelines and tools (e.g. GitHub Actions, Jenkins, CodePipeline, dbt Cloud);
- Hands-on experience with Infrastructure as Code (Terraform, AWS CDK, or CloudFormation);
- Familiarity with data orchestration tools (Airflow, Prefect, Dagster) and ETL/ELT frameworks;
- Proficient in Python or other scripting languages for automation and operational tasks;
- Experience with containerization and orchestration (Docker, Kubernetes);
- Good knowledge of monitoring and observability tools (Prometheus, Grafana, ELK, Datadog);
- Strong focus on reliability, automation, and scalability of data systems.
Smart working: on-site in Milan only 2 days per month, with great flexibility.
Ready to apply?
Join Azienda Riservata Italia and take your career to the next level!
Application takes less than 5 minutes

