We build the tech that moves industries forward. We have our eyes set on AI, energy, logistics, sports and other complex and exciting segments.
We believe in an innovative approach to solving deep issues and encourage our people to find their own solutions. We are constantly rethinking processes, business models, architecture, and tech stacks.
We foster a sense of curiosity, experimentation, and passion beyond code. With us, you can easily deepen your knowledge in any field you’re curious about. And because we work across many industries, you’ll be gaining the experience others can only dream of.
At the forefront of reimagining how industries operate, we are a team of builders and thinkers reshaping e-commerce, ticketing, and logistics from first principles. Our work is grounded in curiosity, experimentation, and a drive for real business impact. We eliminate inefficiencies—not just in code, but in legacy models and outdated assumptions. For those who seek to solve complex problems and mentor others in the process, this is a place to thrive.
We are looking for a Senior Data Engineer who combines deep technical expertise with a strategic mindset. In this role, you’ll architect and build intelligent data ecosystems that power autonomous workflows—integrating Generative and Agentic AI to help businesses move faster, think smarter, and operate more efficiently. Equal parts architect and builder, you’ll be instrumental in delivering high-impact, AI-powered solutions across diverse industries.
Responsibilities:
- Analyze and optimize business processes by collaborating with stakeholders to uncover inefficiencies and define data requirements for automation
- Design scalable, modular data architectures that integrate with Generative AI and Agentic AI systems to support real-time decision-making
- Engineer robust ETL/ELT pipelines using Python, cloud-native services, and orchestration tools, supporting both batch and streaming data needs
- Architect RAG and vector database solutions using semantic search to enable LLMs to retrieve curated, context-rich business data
- Build intelligent data products, from predictive models and decision engines to AI-driven insights platforms
- Implement data quality, validation, and governance frameworks to ensure data integrity, lineage, and compliance across systems
- Lead technical discovery sessions with clients to transform complex business challenges into AI and data-driven opportunities
- Mentor team members on best practices in data engineering, AI integration, and modern cloud architectures
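As a rough illustration of the RAG pattern described above (a toy sketch, not NFQ's actual stack), the core retrieval loop is: embed documents, rank them by cosine similarity against the query, and prepend the top matches to the LLM prompt. The bag-of-words "embedding" below is a hand-rolled stand-in for a real embedding model and vector database.

```python
import math
import re
from collections import Counter

def embed(text: str) -> Counter:
    # Toy stand-in for a real embedding model: bag-of-words token counts.
    return Counter(re.findall(r"[a-z0-9]+", text.lower()))

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

def retrieve(query: str, docs: list[str], k: int = 2) -> list[str]:
    # A vector database (Pinecone, Weaviate, etc.) would do this at scale.
    q = embed(query)
    return sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)[:k]

def build_prompt(query: str, docs: list[str]) -> str:
    # Retrieved context is prepended so the LLM answers from curated business data.
    context = "\n".join(f"- {d}" for d in retrieve(query, docs))
    return f"Context:\n{context}\n\nQuestion: {query}"

docs = [
    "Invoices are exported nightly to the data warehouse.",
    "The ticketing API rate limit is 100 requests per minute.",
    "Warehouse tables are partitioned by ingestion date.",
]
prompt = build_prompt("How often are invoices exported?", docs)
```

In production, the in-memory ranking would be replaced by an approximate-nearest-neighbor query against a vector store, but the shape of the loop stays the same.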
Requirements:
- Expert-level Python proficiency for data engineering, including API integrations, data transformations (Pandas, PySpark), and automation
- Proven experience designing and deploying large-scale data platforms on AWS, GCP, or Azure
- Strong foundation in building production-grade ETL/ELT pipelines using Apache Airflow, Kafka, Spark, or cloud-native tools
- Hands-on experience with vector databases (e.g., Pinecone, Weaviate, Chroma, Milvus) and implementing semantic search
- Demonstrated knowledge of Generative AI and LLMs, with practical experience in RAG architectures and prompt engineering
- Deep understanding of data governance, quality, and documentation, with a focus on lineage, metadata, and compliance
- Familiarity with cloud services including serverless computing, managed databases, and data warehouses such as BigQuery, Redshift, or Snowflake
- Experience working with complex real-world data environments, including legacy systems, SaaS integrations, APIs, and databases
- Fluency in Lithuanian and English, both written and spoken
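To make the pipeline and data-quality requirements above concrete, here is a minimal sketch of a batch ETL step with a validation gate, in plain Python. The function names and in-memory "warehouse" are illustrative stand-ins; in practice this logic would run inside an orchestrator such as Airflow and write to a warehouse like BigQuery, Redshift, or Snowflake.

```python
from dataclasses import dataclass

@dataclass
class Record:
    customer_id: str
    amount: float

def extract(raw_rows: list[dict]) -> list[dict]:
    # In a real pipeline this would pull from an API, SaaS export, or database.
    return raw_rows

def validate(row: dict) -> bool:
    # Simple quality gate: required fields present and amount is non-negative.
    return bool(row.get("customer_id")) and float(row.get("amount", -1)) >= 0

def transform(rows: list[dict]) -> list[Record]:
    # Drop invalid rows instead of letting them poison downstream tables.
    return [Record(r["customer_id"], float(r["amount"])) for r in rows if validate(r)]

def load(records: list[Record], sink: list[Record]) -> int:
    # Stand-in for a warehouse write; returns the number of rows loaded.
    sink.extend(records)
    return len(records)

raw = [
    {"customer_id": "c1", "amount": "19.90"},
    {"customer_id": "", "amount": "5.00"},     # fails validation: no customer_id
    {"customer_id": "c2", "amount": "-3.00"},  # fails validation: negative amount
]
warehouse: list[Record] = []
loaded = load(transform(extract(raw)), warehouse)
```

Validating before the load step keeps bad rows out of the warehouse entirely, which is usually cheaper than repairing them downstream.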
What we offer:
- A high-performing, ambitious, collaborative, and fun working culture
- Health insurance, a yearly training budget (local and international conferences, language courses), and employee-led workshops
- Flexible working hours
- Unlimited WFH (work from home) policy
- Extra vacation days: 2 after working at NFQ for two years and 4 after four years on our team
- Bonus for referrals
- For those who dream of traveling: WFA (work from anywhere) possibilities in NFQ-approved countries
- Office perks and team activities
Salary range: €3850–€6600 gross/month
If you have any questions, please contact me at [email protected] or via LinkedIn.
Check all our career opportunities here.
We may use artificial intelligence (AI) tools to support parts of the hiring process, such as reviewing applications, analyzing resumes, or assessing responses. These tools assist our recruitment team but do not replace human judgment. Final hiring decisions are ultimately made by humans. If you would like more information about how your data is processed, please contact us.
Ready to apply?
Join NFQ and take your career to the next level!
Application takes less than 5 minutes

