$SRC Ecosystem
Data & Integration Engineer
$SRC Ecosystem · Estonia · Posted 13 days ago
Contract · Remote Friendly · Engineering, Information Technology
Job Title: Data & Integration Engineer

Location: Remote
Employment Type: Contract
Department: Technology & Platform Engineering
Payment: in $SRC token, the project's native token

About $SRC Ecosystem

$SRC Ecosystem is an AI and blockchain-powered RWA tokenization platform addressing the $2.5 trillion global trade finance gap. By digitizing trade documents, tokenizing trade assets, and enabling cross-border payments with integrated FX, we empower SMEs worldwide to access fair financing. Our platform is built on zkEVM technology, driving efficiency, transparency, and liquidity in trade finance.

We are scaling rapidly and seeking a Data & Integration Engineer to join our core engineering team. This role is critical to ensuring smooth data flows, maintaining integrations with external partners, and enabling the robust analytics that power our end-to-end trade execution platform.

Key Responsibilities

Design, develop, and maintain data integration interfaces between the $SRC Ecosystem and partner systems (banks, logistics providers, on/off ramping partners).

Implement and manage message bus systems (e.g., RabbitMQ, Kafka) for asynchronous communication and event-driven architecture.

Develop and optimize database schemas, queries, and stored procedures to ensure scalability and high availability.

Build and maintain the data warehouse for analytics, reporting, and business intelligence.

Support data modelling efforts to standardize trade, finance, and compliance data across the platform.

Implement low-code/no-code workflow solutions to accelerate automation for business processes and stakeholder integrations.

Collaborate with front-end, back-end, and AI engineering teams to ensure seamless integration of structured/unstructured data.

Support data analytics pipelines to provide real-time insights on trade flows, risk scoring, and liquidity movements.

Ensure all integrations follow security, KYC/KYB, and compliance standards in line with MiCA and global regulatory frameworks.

Requirements

Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related field.

6–10 years of experience in data engineering, system integration, or related roles.

Strong hands-on skills with databases (PostgreSQL, MySQL, MongoDB or similar).

Proven experience with data integration interfaces and APIs (REST, gRPC, Webhooks).

Hands-on experience with message bus systems (RabbitMQ, Kafka, or equivalent).

Experience in data warehousing and ETL pipelines (Snowflake, BigQuery, Redshift, or equivalent).

Strong foundation in data analytics and modelling.

Familiarity with low-code/no-code workflow platforms (e.g., n8n, Camunda, or similar).

Solid understanding of data security, encryption, and compliance frameworks.

Knowledge of blockchain data structures and Web3 integration is a plus.

Excellent communication and collaboration skills in cross-functional, distributed teams.

What We Offer

Opportunity to shape the future of global trade finance and tokenization.

Work with a high-performing, international team across blockchain, AI, and fintech.

Remote-first work culture with flexibility.

Exposure to Tier-1 global partners in trade, finance, and blockchain ecosystems.
