EIGN
Back-End Data Engineer
United Arab Emirates · Full-time · Remote Friendly · Engineering, Information Technology

Eign (eign.com) is a location intelligence startup empowering the real estate industry with actionable geospatial insights. We aggregate and enrich vast datasets to deliver location scores, neighborhood intelligence, and predictive tools to real estate developers, investors, and platforms.

We’re seeking a highly capable Back-End Data Engineer to design and build robust, scalable, and cost-efficient data infrastructure. You will play a central role in ingesting and transforming data from numerous APIs and internal sources, helping shape our data architecture to support millions of location-specific data points across multiple markets.

Key Responsibilities:

  • Build and manage end-to-end data pipelines ingesting data from 3rd-party APIs and internal data streams.
  • Design cost-optimized workflows that minimize external API usage through smart caching and batching strategies.
  • Develop robust back-end services and APIs to expose data to our internal systems and user-facing products.
  • Implement and manage scalable storage systems, including data warehouses and data lakes.
  • Maintain a robust back-end architecture that supports data workflows and API performance.
  • Monitor and maintain data quality, accuracy, and freshness across all systems.
  • Establish automated monitoring, alerting, and logging systems to ensure data reliability.
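The cost-optimization responsibility above centers on caching and batching calls to third-party APIs. A minimal sketch of that pattern, in Python: the class name `CachingBatchingClient` and the `fetch_batch` callable are illustrative, not part of any Eign codebase, and a production version would typically back the cache with Redis or Memcached rather than an in-process dict.

```python
import time


class CachingBatchingClient:
    """Sketch: cache API responses and batch lookups to cut external calls."""

    def __init__(self, fetch_batch, ttl_seconds=3600):
        # fetch_batch(keys) -> {key: value}; one external call per batch of misses.
        self._fetch_batch = fetch_batch
        self._ttl = ttl_seconds
        self._cache = {}  # key -> (value, expires_at)

    def get_many(self, keys):
        now = time.monotonic()
        results, misses = {}, []
        for key in keys:
            entry = self._cache.get(key)
            if entry and entry[1] > now:
                results[key] = entry[0]  # cache hit: no external call
            else:
                misses.append(key)
        if misses:
            # One batched request covers every miss, instead of one call per key.
            fetched = self._fetch_batch(misses)
            expires = now + self._ttl
            for key, value in fetched.items():
                self._cache[key] = (value, expires)
                results[key] = value
        return results
```

With this shape, repeated lookups for the same locations within the TTL never touch the external API, and cold keys are amortized into a single batched request, which is what keeps per-call API pricing under control.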

Required Skills & Experience:

  • 4+ years of experience in back-end or data engineering roles.
  • Strong knowledge of Python and SQL.
  • Proficiency in developing RESTful APIs and microservices using frameworks such as FastAPI, Flask, or similar.
  • Hands-on experience integrating and managing data from public and private APIs, such as Google APIs, government data sources, financial data feeds (e.g., Fitch), and other large-scale open/public APIs.
  • Experience building and managing data pipelines with tools like Apache Airflow, dbt, or similar.
  • Familiarity with stream processing using Apache Kafka or similar platforms.
  • Proven ability to design scalable solutions with data warehouses like BigQuery, Redshift, or Snowflake.
  • Experience with in-memory caching systems like Redis or Memcached.
  • Working knowledge of cloud environments such as Azure and Google Cloud Platform (GCP).
  • Experience working with geospatial data and APIs is a strong plus.

Nice to Have:

  • Experience with cost monitoring and optimization in cloud-based environments.
  • Familiarity with CI/CD, Terraform, or infrastructure-as-code practices.

What We Offer:

  • The opportunity to join a fast-growing proptech startup on the ground floor.
  • Work on a mission-driven product with significant real-world impact.
  • Hybrid work culture.
  • Competitive salary and equity options.

