WHY JOIN US
If you're looking for a place to grow, make an impact, and work with people who care, we'd love to meet you!
ABOUT THE ROLE
We are looking for a Mid-Level Data Engineer for a focused 2–3 month engagement to deliver high-impact, short-term data solutions through reliable, well-documented pipelines. This role centers on building clean Python and SQL workflows with Airflow, supporting fast-moving projects that require autonomy, precision, and strong ownership. It stands out for its delivery-driven culture, independent working model, and the opportunity to make immediate, measurable impact in a project-based data environment.
WHAT YOU WILL DO
- Design, implement, and maintain data pipelines using Python and SQL;
- Work with Airflow to orchestrate and monitor data workflows;
- Write clean, readable, and well-documented code (Python and SQL);
- Collaborate with stakeholders to understand data requirements and translate them into robust implementations;
- Ensure data reliability, performance, and clarity across pipelines and transformations;
- Contribute to short-term deliverables with a strong sense of ownership and urgency.
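As a rough illustration of the Python-and-SQL pipeline work described above (a minimal sketch only: it uses the standard-library sqlite3 module as a stand-in for a real warehouse, and the table, column, and function names are hypothetical):

```python
import sqlite3

def load_daily_totals(conn: sqlite3.Connection,
                      rows: list[tuple[str, float]]) -> list[tuple[str, float]]:
    """Load raw event rows, then return per-day totals via a SQL aggregation.

    A toy version of a load-and-transform pipeline step: ingest with
    executemany, aggregate with plain SQL so the heavy lifting stays
    close to the data.
    """
    conn.execute("CREATE TABLE IF NOT EXISTS events (day TEXT, amount REAL)")
    conn.executemany("INSERT INTO events (day, amount) VALUES (?, ?)", rows)
    return conn.execute(
        "SELECT day, SUM(amount) FROM events GROUP BY day ORDER BY day"
    ).fetchall()

# Example run against an in-memory database.
conn = sqlite3.connect(":memory:")
totals = load_daily_totals(conn, [
    ("2024-01-01", 10.0),
    ("2024-01-01", 5.0),
    ("2024-01-02", 7.5),
])
```

In a real engagement, a function like this would typically become a task in an Airflow DAG, with the connection pointing at a columnar store rather than SQLite.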
MUST HAVES
- 3+ years of experience as a Data Engineer or similar role;
- Strong Python and SQL skills, with a clear focus on code readability and documentation;
- Hands-on experience with Airflow;
- Solid understanding of columnar databases and analytical data models;
- High productivity without reliance on AI-assisted coding tools;
- Ability to work independently and deliver results in short-term engagements;
- Upper-intermediate English level.
NICE TO HAVES
- Experience with AWS data stack components;
- Exposure to Databricks, ClickHouse, or Vertica;
- Background working on short-term or project-based data initiatives.
PERKS AND BENEFITS
- Professional growth: Mentorship, TechTalks, and personalized growth roadmaps.
- Competitive compensation: USD-based pay with education, fitness, and team activity budgets.
- Exciting projects: Modern solutions with Fortune 500 and top product companies.
- Flextime: Flexible schedule with remote and office options.
Ready to apply?
Join AgileEngine and take your career to the next level!
Application takes less than 5 minutes

