About Huspy
Huspy is one of the leading property technology companies in EMEA.
Launched in 2020, we now operate in multiple cities across the UAE and Spain, and we are expanding into Saudi Arabia and three more European markets by 2026. Today, we hold the largest share of the UAE mortgage market, and we are one of the fastest-growing players in every European city we've entered.
We’ve raised over $140 million (Series A and Series B) from the world’s top investors, including Sequoia Capital and Balderton Capital, to reshape the homebuying journey through powerful technology and agent-first tools.
We’ve built a SuperApp that empowers real estate agents and mortgage brokers, bringing cutting‑edge technology to one of the world’s most traditional industries. We’re transforming how property transactions happen — faster, smarter, and better for everyone.
We’re not slowing down. The question is: will you be part of what’s next?
The Main Event: What You’ll Drive, Build, and Own
- Design and Scale Data Pipelines: Build and maintain reliable batch and streaming pipelines that connect our applications, events, and external sources to Snowflake, ensuring data flows seamlessly across our platform.
- Streaming Data at Scale: Work with Kafka, Schema Registry, and event-driven architectures to power real-time data products and ensure downstream systems get trustworthy, well-structured events.
- Pipeline Orchestration: Use Airflow to orchestrate complex workflows, automate dependencies, and ensure data availability for analytics and business processes (see the DAG sketch after this list).
- Data Infrastructure & Cloud Services: Leverage AWS tools (ECR, ECS, S3, etc.) to build scalable, containerized data services and optimize storage, compute, and costs.
- Data Quality & Observability: Implement testing frameworks, monitoring, and alerting for data pipelines to ensure reliability and trust at scale.
- Infrastructure as Code: Manage data infrastructure with tools like Terraform, Helmfile, or CloudFormation, automating deployments and ensuring reproducibility.
- Optimize for Scale and Performance: Continuously improve the throughput, latency, and cost efficiency of pipelines, ensuring the platform scales with the business.
- Modeling & Governance: Apply strong data modeling fundamentals (e.g., Kimball, Data Vault) and enforce governance around PII, compliance, and lifecycle management.
- Collaboration for Impact: Partner with analysts, business stakeholders, and product teams to translate requirements into well-designed pipelines and models that unlock business value.
- Automation & CI/CD: Implement testing, linting, and automated deployments for data pipelines to ensure high-quality, production-ready code.
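To make the day-to-day concrete, here is a minimal sketch of the kind of Airflow pipeline this role would own. It is illustrative only, not Huspy's actual code: the DAG name, task names, and schedule are hypothetical, and the extract/load bodies are stubbed out.

```python
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@hourly", start_date=datetime(2024, 1, 1), catchup=False)
def events_to_snowflake():
    """Hypothetical hourly pipeline moving application events into Snowflake."""

    @task
    def extract_events() -> list:
        # Stub: a real task would pull a batch of Kafka events
        # (or read a landed file from S3) and return the records.
        return [{"event_id": 1, "event_type": "mortgage_application"}]

    @task
    def load_to_snowflake(events: list) -> None:
        # Stub: a real task would stage the batch and run a COPY or
        # MERGE into Snowflake via the Snowflake provider's hook.
        print(f"loading {len(events)} events")

    # Airflow infers the task dependency graph from this call chain.
    load_to_snowflake(extract_events())


events_to_snowflake()
```

In practice everything here scales up: dependencies fan out, stubs become operators and sensors, and data quality checks sit between extract and load.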
What You'll Bring
- Proven Experience: 4+ years in a Data Engineering role (or Analytics Engineering with strong pipeline/infrastructure experience).
- Python First: Proficiency in Python for ETL, orchestration, and automation.
- Airflow Practitioner: Hands-on experience building DAGs and managing workflows in Airflow (or equivalent orchestration tool).
- Streaming & Batch Expertise: Comfort working with event streaming systems (Kafka, Schema Registry) as well as traditional batch pipelines (see the consumer sketch after this list).
- Cloud Native: Practical experience with AWS services (S3, ECR, ECS, IAM, etc.).
- SQL & Warehousing Skills: Strong SQL and exposure to cloud warehouses (Snowflake preferred).
- Modeling Mindset: Ability to design data models that balance scalability, simplicity, and business needs.
- Stakeholder Collaboration: Skilled at working with product, ops, and analytics teams to turn business requirements into working data solutions.
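For a flavor of the streaming side, a bare-bones Kafka consumer in Python (using the confluent-kafka client) looks like the sketch below. The broker address, topic, and group id are placeholders, and in a real pipeline each event would be validated against Schema Registry before further processing.

```python
from confluent_kafka import Consumer

# Placeholder connection settings for illustration only.
conf = {
    "bootstrap.servers": "localhost:9092",
    "group.id": "demo-consumer",
    "auto.offset.reset": "earliest",
}

consumer = Consumer(conf)
consumer.subscribe(["property.events"])  # hypothetical topic name

try:
    while True:
        msg = consumer.poll(timeout=1.0)
        if msg is None:
            continue
        if msg.error():
            print(f"consumer error: {msg.error()}")
            continue
        # A production consumer would deserialize and validate the
        # payload against Schema Registry here, then hand it downstream.
        print(f"{msg.topic()}[{msg.partition()}] @ {msg.offset()}: {msg.value()}")
finally:
    consumer.close()
```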
Ready to apply?
Join Huspy and take your career to the next level!
Applying takes less than 5 minutes.