We are proud to partner with some of the world’s leading investors. New Enterprise Associates led our $22m Series B round in 2022, with Philip Chopin joining Sequoia’s Luciana Lixandru on our board.
We were founded in Switzerland in 2017, and today we operate globally from offices in Zurich and London. We encourage diversity: our international team comes from 26 different countries and speaks 25 different languages.
As a Data Engineer at Ledgy, your mission is to build robust data pipelines, design scalable data architecture, and collaborate with teams to deliver insights that drive business decisions. Reporting directly to the Head of Operations & AI, you'll play a key role in driving our data engineering strategy.
At Ledgy, you will:
- Manage and optimize data infrastructure and ETL pipelines using Fivetran, Airbyte, and Google Cloud Platform, ensuring reliable data flow from multiple sources into our analytics ecosystem
- Develop, test, and maintain DBT models that transform raw data into analytics-ready datasets following best practices
- Create and manage LookML models in Looker to enable self-service analytics for stakeholders across the company
- Drive continuous improvement of our data engineering practices, tooling, and infrastructure as a key member of the Operations team
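To give a flavor of the transformation work described above: dbt models are written in SQL, but the same staging-style step (cast raw fields, aggregate into an analytics-ready table) can be sketched in Python with pandas, one of the tools the role calls for. The data below is made up purely for illustration:

```python
import pandas as pd

# Toy "raw" records, standing in for rows landed by an EL tool like Fivetran
raw = pd.DataFrame({
    "customer_id": [1, 1, 2],
    "amount_chf": ["10.50", "4.00", "99.90"],  # raw sources often land as strings
    "created_at": ["2024-01-05", "2024-01-07", "2024-01-06"],
})

# Staging-style transformation: cast types so downstream models can trust them
staged = raw.assign(
    amount_chf=raw["amount_chf"].astype(float),
    created_at=pd.to_datetime(raw["created_at"]),
)

# Analytics-ready mart: total spend per customer
per_customer = (
    staged.groupby("customer_id", as_index=False)["amount_chf"]
    .sum()
    .rename(columns={"amount_chf": "total_amount_chf"})
)
print(per_customer)
```

In a dbt project, each of these steps would typically live in its own SQL model (a `stg_` model for the casts, a mart model for the aggregate), with tests on keys and totals.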
What we're looking for:
- 2-3+ years of experience building production data pipelines and analytics infrastructure with DBT, SQL, and Python (Pandas, etc.)
- Experience implementing and managing ETL/ELT tools such as Fivetran or Airbyte
- Ideally hands-on experience with GCP (BigQuery)
- Proficiency in Looker, including LookML development
- Strong plus if you have experience using n8n or similar automation tools
- Experience with SaaS data sources (HubSpot, Stripe, Vitally, Intercom)
- Familiarity with AI-powered development tools (Cursor, DBT Copilot) and a strong interest in leveraging cutting-edge tools to improve workflow
- Strong problem-solving skills and ability to debug complex data issues
- Excellent communication skills, with the ability to explain technical concepts to non-technical stakeholders

