If you like a challenging environment where you’re working with the best and are encouraged to learn and experiment every day, there’s no better place - guaranteed! :)
What you will do
- Design, build, and maintain scalable, high-performance backend services using Go (Golang) and Python for data pipeline automation;
- Optimize real-time market data processing pipelines for accuracy, speed, and reliability;
- Work with Kafka, Redis, MongoDB, and SQL to manage and process large-scale financial datasets (a consumer sketch in this spirit follows the list);
- Implement and enhance gRPC-based services to support low-latency data exchange;
- Collaborate with data engineers and product teams to ensure seamless data ingestion and API performance;
- Contribute to infrastructure improvements, leveraging cloud platforms (AWS, Google Cloud, or Azure) and container orchestration tools like Docker and Kubernetes;
- Develop and maintain CI/CD pipelines to streamline deployments;
- Ensure data quality and integrity across all systems, implementing monitoring and logging tools;
- Participate in code reviews, technical discussions, and architectural decisions;
- Troubleshoot and debug complex backend issues, ensuring system stability.
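
For a sense of the ingestion work described above, here is a minimal, illustrative Go sketch of a consumer reading market ticks from Kafka. The broker address, topic name, and Tick payload shape are assumptions invented for this example (the posting does not specify them); the client library shown is segmentio/kafka-go.

```go
// Minimal sketch: consume market ticks from Kafka and track the latest
// price per symbol. Topic name, broker address, and the Tick schema are
// assumptions for illustration, not details from the job description.
package main

import (
	"context"
	"encoding/json"
	"log"

	"github.com/segmentio/kafka-go"
)

// Tick is a hypothetical payload shape for a single market-data update.
type Tick struct {
	Symbol string  `json:"symbol"`
	Price  float64 `json:"price"`
	TsMs   int64   `json:"ts_ms"`
}

func main() {
	reader := kafka.NewReader(kafka.ReaderConfig{
		Brokers: []string{"localhost:9092"}, // assumed local broker
		GroupID: "tick-consumer",            // consumer group for offset tracking
		Topic:   "market.ticks",             // hypothetical topic name
	})
	defer reader.Close()

	latest := make(map[string]float64) // last seen price per symbol

	for {
		msg, err := reader.ReadMessage(context.Background())
		if err != nil {
			log.Fatalf("read: %v", err)
		}
		var t Tick
		if err := json.Unmarshal(msg.Value, &t); err != nil {
			log.Printf("skip malformed tick at offset %d: %v", msg.Offset, err)
			continue
		}
		latest[t.Symbol] = t.Price
		log.Printf("%s -> %.2f (offset %d)", t.Symbol, t.Price, msg.Offset)
	}
}
```

In a production pipeline the handler would typically write into Redis or MongoDB and emit metrics rather than keep an in-memory map, but the read-decode-process loop has the same shape.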
Must haves
- 3+ years of experience in backend development with Go (Golang) and Python;
- Strong understanding of distributed systems and event-driven architectures;
- Experience with Kafka, Redis, MongoDB, and SQL for real-time data processing;
- Proficiency in gRPC for efficient API communication, including streaming (gRPC Stream; see the sketch after this list);
- Ability to optimize high-throughput, low-latency systems;
- Experience with cloud platforms (AWS, Google Cloud, or Azure);
- Familiarity with containerization (Docker, Kubernetes) and microservices architectures;
- Strong problem-solving skills and ability to work in a fast-paced, data-driven environment;
- Good communication skills, with experience working in cross-functional teams;
- Upper-Intermediate English level.
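
As a rough illustration of the gRPC streaming requirement above, the sketch below shows a server-streaming handler with grpc-go. Everything in the pb package (service name, message types, registration function) is a hypothetical stand-in for what protoc would generate from a ticks.proto; only the grpc and net calls are real APIs.

```go
// Minimal server-streaming sketch with grpc-go. Everything under "pb" is a
// stand-in for protoc-generated code from a hypothetical ticks.proto
// (service MarketData { rpc StreamTicks(TickRequest) returns (stream Tick); }).
package main

import (
	"log"
	"net"
	"time"

	"google.golang.org/grpc"

	pb "example.com/marketdata" // hypothetical generated package
)

type server struct {
	pb.UnimplementedMarketDataServer
}

// StreamTicks pushes one tick per interval until the client disconnects.
func (s *server) StreamTicks(req *pb.TickRequest, stream pb.MarketData_StreamTicksServer) error {
	for {
		tick := &pb.Tick{Symbol: req.Symbol, Price: 42.0, TsMs: time.Now().UnixMilli()}
		if err := stream.Send(tick); err != nil {
			return err // Send fails once the client goes away; end the stream
		}
		time.Sleep(100 * time.Millisecond) // placeholder for a real tick source
	}
}

func main() {
	lis, err := net.Listen("tcp", ":50051")
	if err != nil {
		log.Fatalf("listen: %v", err)
	}
	s := grpc.NewServer()
	pb.RegisterMarketDataServer(s, &server{})
	log.Fatal(s.Serve(lis))
}
```

Server streaming keeps a single long-lived HTTP/2 stream open per subscriber, which is why it suits low-latency tick distribution better than repeated unary calls.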
Nice to haves
- Experience working with real-time financial market data and understanding of data quality principles;
- Experience with C++;
- Familiarity with crypto ecosystems (CeFi/DeFi), blockchain analytics, or digital asset trading;
- Knowledge of time-series databases and efficient data storage techniques;
- Exposure to high-frequency trading (HFT), algo trading, or market surveillance tools;
- Understanding of security best practices in API development and data handling.
What we offer
- Professional growth
- Competitive compensation
- A selection of exciting projects
- Flextime
Next Steps After You Apply
The next steps of your journey will be shared via email within a few hours. Please check your inbox regularly and watch for updates from our Internal Applicant site, LaunchPod, which will guide you through the process.