What if your work could drive change in a globally established industry, shaping processes that touch every corner of the world? At Forto, we are at the forefront of change, harnessing the power of AI to revolutionise logistics. We want to reinvent digital supply chains to be transparent, frictionless and sustainable. From day one, our mission has been to simplify global trade – creating a seamless and efficient logistics process.
About The Role
As a Data Engineer, you will play a crucial role in building, maintaining, and scaling Forto’s modern cloud-based data infrastructure. You will support and empower stakeholders throughout the company as we move toward globally scaled, highly automated, and data-driven operations.
Our Current Data Stack Includes
- BigQuery for data warehousing
- Google Cloud Pub/Sub for ingestion
- dbt for data transformation
- Apache Airflow (on Composer) for orchestration
- Python for scripting
- Monte Carlo and Datadog for observability
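To make the flow through this stack concrete, here is a minimal, illustrative Python sketch of the kind of step that sits between ingestion and warehousing: decoding a Pub/Sub-style JSON payload into a flat row ready for a BigQuery staging table, with dbt handling downstream modelling. The message schema and field names are hypothetical, not Forto's actual data model.

```python
import json
from datetime import datetime, timezone


def event_to_row(raw: bytes) -> dict:
    """Decode a Pub/Sub-style JSON payload into a flat dict that could be
    streamed into a BigQuery staging table.

    Field names are illustrative only.
    """
    event = json.loads(raw.decode("utf-8"))
    return {
        "shipment_id": event["shipment_id"],
        "status": event["status"].lower(),
        "ingested_at": datetime.now(timezone.utc).isoformat(),
    }


# Hypothetical message, as Pub/Sub would deliver it: a raw bytes payload.
row = event_to_row(b'{"shipment_id": "S-123", "status": "IN_TRANSIT"}')
print(row["shipment_id"], row["status"])  # prints: S-123 in_transit
```

In practice a function like this would run inside an Airflow task (on Composer), with Monte Carlo and Datadog watching the resulting tables for freshness and volume anomalies.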
Your Responsibilities
- Design, build, and maintain scalable and reliable data systems to support Forto’s innovation in the logistics industry.
- Collaborate with cross-functional teams—including Software Engineering, Analytics, and Product—to deliver impactful data products.
- Drive the adoption of robust, high-throughput data processing architectures.
- Partner with stakeholders to promote a strong, data-driven culture.
- Manage and optimize GCP infrastructure, including CI/CD workflows, Terraform, and container orchestration.
- Contribute to the ongoing development of our DevOps and Data Engineering best practices.
What You Bring
- Proficient in Python, with a strong understanding of data processing libraries, testing strategies, and best practices
- Highly skilled in SQL, particularly with BigQuery, and experienced with modern data platforms
- Demonstrated experience building and maintaining large-scale pipelines using Apache Airflow
- Genuine interest in AI/LLM engineering, including RAG, fine-tuning, prompt engineering, or LLMOps
- Possess a growth mindset with a drive to learn and keep pace with advancements in data engineering
- Experienced in cross-functional environments and comfortable engaging in architectural discussions, stakeholder meetings, and collaborative working models (pair/mob sessions)
- Fluent in written and spoken English
- Willing to work in a hybrid arrangement, spending at least 2 days per week in the office
Why work with us?
Our team is hard-working and constantly seeks to maximise the impact of its work, but we put our people first and win with care. We value efficient systems and swift, direct communication. We make sure everyone has time to speak, so that diverse perspectives can drive us towards solutions.
Ready to apply?
Join Forto and take your career to the next level!
Application takes less than 5 minutes

