Key Responsibilities:
DevOps and CI/CD: Design, implement, and manage CI/CD pipelines using tools like Jenkins and GitOps to automate and streamline the software development lifecycle.
Containerization and Orchestration: Deploy and manage containerized applications using Kubernetes and OpenShift, ensuring high availability and scalability.
Infrastructure Management: Develop and maintain infrastructure as code (IaC) using tools like Terraform or Ansible.
Big Data Solutions: Architect and implement big data solutions using technologies such as Hadoop, Spark, and Kafka.
Distributed Systems: Design and manage distributed data architectures to ensure efficient data processing and storage.
Collaboration: Work closely with development, operations, and data teams to understand requirements and deliver robust solutions.
Monitoring and Optimization: Implement monitoring solutions and optimize system performance, reliability, and scalability.
Security and Compliance: Ensure infrastructure and data solutions adhere to security best practices and regulatory requirements.
Qualifications:
Education: Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
Experience: Minimum of 5 years of experience in big data engineering or a related role.
Technical Skills:
o Proficiency in CI/CD tools such as Jenkins and GitOps.
o Strong experience with containerization and orchestration tools like Kubernetes and OpenShift.
o Knowledge of big data technologies such as Hadoop, Spark, and ETL pipelines.
o Proficiency in scripting languages such as Python, Bash, or Groovy.
o Familiarity with infrastructure as code (IaC) tools like Terraform or Ansible.
Soft Skills:
o Excellent problem-solving and analytical skills.
o Strong communication and collaboration abilities.
o Ability to work in a fast-paced, dynamic environment.
Preferred Qualifications:
Certifications in DevOps, cloud platforms, or big data technologies.
Experience with monitoring and logging tools such as Prometheus, Grafana, or ELK Stack.
Knowledge of security best practices in DevOps and data engineering.
Familiarity with agile methodologies and continuous integration/continuous deployment (CI/CD) practices.