Position Details
Position: Data Engineer / Apache Airflow Specialist
Level: Senior
Technologies: Apache Airflow, Python, Flask, Elasticsearch, Unix/Linux, Oracle, PostgreSQL, GitLab
Workload: 1100 hrs/year
Location: Remote — work from anywhere
English level: Advanced
Education: Technical degree
Role Description
Intetics is looking for an experienced Data Engineer / Apache Airflow Specialist to join our distributed team for a data-driven project focused on large-scale ETL workflows, data indexing, and performance optimization.
The specialist will design, implement, and optimize data pipelines using Apache Airflow, manage database performance on Oracle and PostgreSQL, and support Elasticsearch integration for efficient data retrieval and search operations.
You will also be responsible for ensuring smooth deployment pipelines in GitLab and collaborating with a cross-functional engineering team to enhance the quality and reliability of complex data processes.
Requirements
Technical Responsibilities and Skills
What You'll Do:
- Develop, orchestrate, and maintain complex Apache Airflow DAGs for ETL and data-processing pipelines.
- Build and optimize Python-based ETL scripts, integrating with Flask APIs when needed.
- Design and manage Elasticsearch indexing and performance tuning workflows.
- Handle Unix/Linux scripting and operations for automation and monitoring.
- Work with Oracle and PostgreSQL databases for large-scale data processing.
- Implement and maintain GitLab CI/CD pipelines for build, test, and deploy stages.
- Collaborate with the project team to ensure scalability, reliability, and quality of data solutions.
What You'll Bring:
- ≥ 3 years of Apache Airflow DAG orchestration.
- ≥ 5 years of Python (ETL focus), with Flask API experience as a plus.
- ≥ 3 years of Elasticsearch (data indexing & optimization).
- ≥ 3 years of Unix/Linux scripting & operations.
- ≥ 3 years with Oracle or PostgreSQL (ideally both).
- ≥ 3 years of GitLab pipelines (build/test/deploy).
- Advanced English and a technical degree.
- Experience with Great Expectations or similar data-quality tools.
- Airflow on Kubernetes.
- Proven performance tuning experience.
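The core of the role above is DAG orchestration: expressing an ETL pipeline as tasks with explicit dependencies so the scheduler runs them in a valid order. As a rough illustration of that dependency model only, here is a minimal sketch using Python's standard-library graphlib rather than the Airflow API itself; the task names (extract, transform, etc.) are hypothetical examples, not part of the posting.

```python
from graphlib import TopologicalSorter

# A toy ETL pipeline as a dependency graph: each task maps to the
# set of tasks that must finish before it. In Airflow the same
# relationships would be declared with operators and `a >> b` edges.
pipeline = {
    "extract": set(),
    "transform": {"extract"},
    "index_to_elasticsearch": {"transform"},
    "load_to_postgres": {"transform"},
    "notify": {"index_to_elasticsearch", "load_to_postgres"},
}

# Resolve one valid execution order for the whole pipeline.
order = list(TopologicalSorter(pipeline).static_order())
print(order)
```

Airflow adds scheduling, retries, and parallel execution on top of this ordering, but the underlying guarantee is the same: a task never runs before everything it depends on has completed.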
Ready to apply?
Join Intetics and take your career to the next level!
Application takes less than 5 minutes

