DayOne is a global leader in the development and operation of high-performance data centers. As one of the fastest-growing companies in the industry, we’ve built a robust presence across Asia and Europe — and we’re just getting started.
As we expand into new international markets, we’re looking for talented, driven individuals to join us on this exciting journey. This is more than a job — it’s an opportunity to be a key contributor to our dynamic team and help shape the future of global data infrastructure.
If you're passionate about innovation, technology, and growth, we invite you to be part of DayOne’s next chapter.
What You’ll Be Doing
Your responsibilities will include:
- Assisting in designing and developing scalable ETL pipelines to process large volumes of data efficiently, integrating data from systems like Autodesk Construction Cloud (ACC), SAP, and other software.
- Collaborating with the team to integrate various data sources into Dataverse, automating data flows for seamless analysis and reporting.
- Building and maintaining RESTful APIs to streamline data exchange between internal and external systems.
- Analyzing and troubleshooting data pipeline performance to ensure high reliability and scalability.
- Supporting data integration efforts to ensure the timely availability of clean, accurate, and consistent data across platforms.
- Participating in code reviews and ensuring the implementation of best practices in data engineering.
What We're Looking For
- Currently pursuing a degree in Computer Science, Data Engineering, Information Technology, or a related field.
- Proficiency in Python, with hands-on experience in building ETL pipelines.
- Familiarity with API design, development, and integration (RESTful APIs), especially for systems like Autodesk Construction Cloud (ACC) and SAP.
- Understanding of Dataverse as a data storage solution and best practices for handling large datasets.
- Strong problem-solving skills with a passion for building scalable data solutions.
- Ability to work collaboratively in a team environment and communicate technical concepts effectively.
Nice to Have
- Experience with cloud-based platforms like AWS, Azure, or Google Cloud.
- Knowledge of containerization technologies such as Docker.
- Exposure to data processing frameworks such as Apache Spark or Apache Kafka.
- Familiarity with CI/CD processes for data engineering tasks.
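For candidates unsure what "building ETL pipelines in Python" looks like in practice, here is a minimal sketch of the extract/transform/load pattern. All names and data in it are illustrative assumptions, not DayOne's actual systems: the raw JSON stands in for an export from a source API, and the keyed dictionary stands in for a warehouse table.

```python
# Minimal illustrative ETL sketch. The source payload, field names, and
# in-memory "store" are hypothetical stand-ins, not DayOne's real stack.
import json


def extract(raw: str) -> list[dict]:
    """Extract: parse a raw JSON payload exported from a source system."""
    return json.loads(raw)


def transform(records: list[dict]) -> list[dict]:
    """Transform: drop incomplete rows and normalize field names/types."""
    return [
        {"project_id": r["id"], "cost": float(r["cost"])}
        for r in records
        if r.get("id") is not None and r.get("cost") is not None
    ]


def load(records: list[dict], store: dict) -> None:
    """Load: upsert each record into a keyed store (warehouse stand-in)."""
    for r in records:
        store[r["project_id"]] = r


# Example run with fabricated data: the row with a null id is filtered out.
raw = '[{"id": 1, "cost": "99.5"}, {"id": null, "cost": "3"}]'
store: dict = {}
load(transform(extract(raw)), store)
print(store)  # {1: {'project_id': 1, 'cost': 99.5}}
```

Real pipelines add scheduling, retries, and validation on top of this shape, but the three-stage structure stays the same.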
If you're ready to grow with one of the fastest-moving companies in the data center industry, apply now and be part of our global journey.

