Join our Development Center and become a member of our open-minded, progressive, and professional team. You will have the chance to grow your technical and soft skills and build thorough expertise in our client's industry. In this role you will be working on projects for one of our world-famous clients, a large international investment bank.

We are seeking a technically strong and hands-on Data Architect to design and own end-to-end data sourcing and integration solutions. This role focuses on architecting scalable, secure, and efficient data pipelines that ingest, process, and publish data from a wide variety of sources — including traditional databases, Hadoop, NoSQL systems, and file-based inputs — into our hybrid data platform spanning Google Cloud and on-premises infrastructure.

This is a hands-on role, requiring active participation in solution implementation, including code commits, technical reviews, and close collaboration with engineering teams.

On top of an attractive salary and benefits package, Luxoft will invest in your professional training and support the growth of your professional career.
Responsibilities:
- Solution Architecture
  - Design comprehensive data sourcing strategies and technical interfaces with feed providers.
  - Architect data pipelines for ingestion, transformation, and delivery to distribution layers such as BigQuery and Cloud Storage.
  - Ensure alignment with enterprise data governance, security, and performance standards.
- Platform Integration
  - Define integration patterns for hybrid environments (GCP + on-prem).
  - Own the technical blueprint for data movement, preprocessing, and publication.
- Stakeholder Collaboration
  - Engage with internal and external data providers to agree on technical specifications and delivery formats.
  - Collaborate with engineering teams to ensure architectural designs are implemented effectively.
- Technical Execution
  - Contribute directly to codebases and configuration repositories.
  - Participate in code reviews, testing, and deployment processes.
Mandatory Skills Description:
- Deep expertise in:
  - Oracle, Postgres, Hadoop, and NoSQL databases
  - Data pipeline frameworks (e.g., Apache Spark, Airflow, Dataflow, Dataproc)
  - Google Cloud Platform (BigQuery, Cloud Storage, Pub/Sub, Composer, Cloud Run, IAM)
  - Data virtualization tools such as Trino, Presto, Denodo, or similar
  - Cloud security and IAM (Identity and Access Management) best practices
- Strong understanding of hybrid cloud architectures and data lifecycle management.
- Experience designing scalable and secure data solutions in enterprise environments.
- Excellent communication and stakeholder engagement skills.
Nice-to-Have Skills Description:
- Familiarity with metadata management and data cataloging tools.
- Exposure to CI/CD and DevOps practices for data platforms.
- Experience in regulated industries or large-scale data environments.
Ready to apply?
Join Luxoft and take your career to the next level!
Application takes less than 5 minutes

