We are seeking a talented and experienced Data Engineer to join our team at Provectus. As part of our diverse practices, including Data, Machine Learning, DevOps, Application Development, and QA, you will collaborate with a multidisciplinary team of data engineers, machine learning engineers, and application developers. You will encounter numerous technical challenges, contribute to internal solutions, and engage in R&D activities, in an environment that supports professional growth.
Requirements:
- 5+ years of experience in data engineering;
- Experience in AWS;
- Experience handling real-time and batch data flow and data warehousing with tools and technologies like Airflow, Dagster, Kafka, Apache Druid, Spark, dbt, etc. (see the orchestration sketch after this list);
- Proficiency in programming languages relevant to data engineering, such as Python and SQL;
- Proficiency with Infrastructure as Code (IaC) technologies like Terraform or AWS CloudFormation;
- Experience in building scalable APIs;
- Familiarity with Data Governance aspects like Quality, Discovery, Lineage, Security, Business Glossary, Modeling, Master Data, and Cost Optimization;
- Upper-Intermediate or higher English skills;
- Ability to take ownership, solve problems proactively, and collaborate effectively in dynamic settings
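For illustration only, here is a minimal sketch of the kind of batch orchestration referenced above, assuming Apache Airflow 2.x; the DAG id, task names, and extract/transform callables are hypothetical placeholders rather than anything specific to this role.

```python
# Hypothetical daily batch pipeline sketch (Apache Airflow 2.x assumed).
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Placeholder for pulling raw records from a source system.
    return [{"id": 1, "value": 42}]


def transform(**context):
    # Read the upstream task's output via XCom and apply a trivial transformation.
    rows = context["ti"].xcom_pull(task_ids="extract")
    return [{**row, "value": row["value"] * 2} for row in rows]


with DAG(
    dag_id="example_batch_pipeline",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    extract_task >> transform_task  # run transform after extract
```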
Nice to have:
- Experience with Cloud Data Platforms (e.g., Snowflake, Databricks);
- Experience in building Generative AI Applications (e.g., chatbots, RAG systems);
- Relevant AWS, GCP, Azure, Databricks certifications;
- Knowledge of BI Tools (Power BI, QuickSight, Looker, Tableau, etc.);
- Experience in building Data Solutions in a Data Mesh architecture
Responsibilities:
- Collaborate closely with clients to deeply understand their existing IT environments, applications, business requirements, and digital transformation goals;
- Collect and manage large volumes of varied data sets;
- Work directly with ML Engineers to create robust and resilient data pipelines that feed Data Products;
- Define data models that integrate disparate data across the organization;
- Design, implement, and maintain ETL/ELT data pipelines;
- Perform data transformations using tools such as Spark, Trino, and AWS Athena to handle large volumes of data efficiently;
- Develop, continuously test, and deploy Data API Products with Python and frameworks like Flask or FastAPI (a minimal endpoint sketch follows this list)
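For illustration only, here is a minimal sketch of a data API endpoint of the kind described above, using FastAPI; the route, model fields, and in-memory lookup are hypothetical stand-ins for a real data product backed by a warehouse query.

```python
# Hypothetical data API sketch using FastAPI; run with: uvicorn example_api:app
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="Example Data Product API")  # hypothetical service name


class Metric(BaseModel):
    name: str
    value: float


# Stand-in for a real query against a warehouse or lakehouse table.
_FAKE_STORE = {"daily_active_users": Metric(name="daily_active_users", value=1234.0)}


@app.get("/metrics/{metric_name}", response_model=Metric)
def read_metric(metric_name: str) -> Metric:
    # Return the requested metric, or 404 if it is unknown.
    metric = _FAKE_STORE.get(metric_name)
    if metric is None:
        raise HTTPException(status_code=404, detail="metric not found")
    return metric
```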
Benefits:
- Long-term B2B collaboration;
- Paid vacations and sick leaves;
- Public holidays;
- Compensation for medical insurance or sports coverage;
- External and Internal educational opportunities and AWS certifications;
- A collaborative local team and international project exposure
Ready to apply?
Join Provectus and take your career to the next level!
Application takes less than 5 minutes

