Senior Data Scientist
GlobalLogic (Argentina)
Expand your skills by collaborating with a diverse team of highly talented people in an open, laid-back environment, and work at the forefront of digital transformation!
We offer the opportunity to help create market-defining products using the latest technologies, with clients across all industries and sectors. GlobalLogic prioritizes work-life balance, which is why we offer flexible work arrangements.
Job Description
We’re looking for a teammate with:
The ideal candidate will have hands-on experience building and shipping production ML
systems, with deep proficiency in Python and the modern AWS ML stack:
• Python Expertise: Python is your primary language. You write clean, well-structured
code and are comfortable owning end-to-end ML workflows — from data ingestion and
EDA through model training, validation, and deployment.
• AWS SageMaker: Practical, hands-on experience with SageMaker as your primary ML
platform — including SageMaker Studio, Training Jobs, Pipelines, Model Registry, and
real-time or batch Inference Endpoints.
• Machine Learning Fundamentals: Strong grounding in supervised and unsupervised
ML methods — gradient boosting, neural networks, dimensionality reduction, clustering,
and survival/time-to-event models. Experience with frameworks such as scikit-learn,
XGBoost, LightGBM, and PyTorch or TensorFlow.
• Feature Engineering and Data Wrangling: Demonstrated ability to extract, clean, and
engineer features from complex, multi-source datasets using Python (pandas, numpy,
PySpark) and SQL against platforms such as Snowflake or similar cloud data
warehouses.
• Model Evaluation and Experimentation: Rigorous approach to model evaluation —
cross-validation, holdout testing, calibration, and business-metric alignment. Experience
with experiment tracking tools such as MLflow or SageMaker Experiments.
• Cloud and Infrastructure Awareness: Solid AWS experience beyond SageMaker,
including S3, IAM, Lambda, and Step Functions. Familiarity with infrastructure-as-code
or CI/CD patterns for ML pipelines is a plus.
• Data Platform Integrations: Hands-on experience working with Snowflake, Apache
Iceberg, or similar modern data platforms as upstream data sources for ML pipelines.
Familiarity with our client's Cloud Analytics or Talend Cloud platforms is a strong plus.
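To illustrate the evaluation rigor described above (cross-validation on a training split plus a separate holdout test), here is a minimal sketch using scikit-learn. The synthetic dataset, the choice of `GradientBoostingClassifier`, and the metrics are illustrative assumptions, not specifics of the role:

```python
# Illustrative sketch only: cross-validation plus holdout evaluation.
# Dataset and model choice are assumptions for the example.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import cross_val_score, train_test_split

# Synthetic stand-in for a real multi-source dataset.
X, y = make_classification(n_samples=600, n_features=12, random_state=0)

# Reserve a holdout split that cross-validation never sees.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=0
)

model = GradientBoostingClassifier(random_state=0)

# 5-fold cross-validation on the training split only.
cv_auc = cross_val_score(model, X_train, y_train, cv=5, scoring="roc_auc")

# Final sanity check on the untouched holdout split.
model.fit(X_train, y_train)
holdout_acc = model.score(X_test, y_test)

print(f"CV AUC: {cv_auc.mean():.3f} +/- {cv_auc.std():.3f}")
print(f"Holdout accuracy: {holdout_acc:.3f}")
```

In practice the cross-validated estimate guides model selection, while the holdout score is reported once, at the end, against the agreed business metric.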
Beyond technical skills, we’re looking for someone who brings:
• Bias for Impact: You care about whether your models actually change decisions — not
just whether they score well on a leaderboard.
• Strong Communication: Ability to explain model behavior, limitations, and business
implications to non-technical stakeholders clearly and without jargon.
• Security and Governance Mindset: Awareness of responsible AI practices, data
privacy considerations, model auditability, and the importance of reproducibility in
production ML systems.
• Collaborative Spirit: Comfortable working across functions and levels, from data
engineers and CSMs to the C-suite.
Job Responsibilities
Here’s how you’ll be making an impact:
• Build and Deploy ML Models: Design, train, evaluate, and deploy supervised and
unsupervised machine learning models on AWS SageMaker — including classification,
regression, clustering, and anomaly detection use cases.
• Own the Feature Engineering Pipeline: Develop robust, reusable feature pipelines in
Python that transform raw data from Snowflake, our client's Cloud Analytics platform, and other sources
into high-quality model inputs.
• Integrate with the Data Ecosystem: Connect model pipelines to our client's Cloud Analytics
and Talend Cloud platforms, Snowflake, and Apache Iceberg, ensuring data freshness, lineage,
and governance standards are met.
• Operationalize Models at Scale: Leverage SageMaker Pipelines, Model Registry, and
Endpoints to bring models into production reliably — with monitoring, drift detection, and
retraining workflows in place.
• Support LLM-Augmented Workflows: Collaborate with AI Systems Engineers to
integrate predictive model outputs as structured signals into agentic AI pipelines
deployed on AWS Bedrock.
• Translate Signals into Action: Partner with Customer Success, Sales, and Analytics
stakeholders to translate model outputs into actionable insights, dashboards, and
automated intervention triggers.
• Iterate and Instrument: Operate in a fast-moving incubator environment — prototype
quickly, measure model performance against business outcomes, and continuously
refine based on real usage signals.
• Document and Govern: Maintain clear model cards, experiment logs, and data lineage
documentation in support of our client’s AI governance framework and ISO 42001 compliance posture.
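The "reusable feature pipeline" responsibility above can be sketched as a small pandas transformation that turns raw events into per-customer model inputs. The table schema and column names here are hypothetical, purely for illustration:

```python
# Illustrative sketch only: a reusable feature-engineering step.
# Column names (customer_id, event_type, event_date, duration_sec) are hypothetical.
import pandas as pd

def build_customer_features(events: pd.DataFrame) -> pd.DataFrame:
    """Aggregate raw usage events into per-customer model inputs."""
    features = events.groupby("customer_id").agg(
        event_count=("event_type", "size"),
        active_days=("event_date", "nunique"),
        avg_duration=("duration_sec", "mean"),
    )
    # Simple derived signal: events per active day.
    features["events_per_day"] = features["event_count"] / features["active_days"]
    return features.reset_index()

# Tiny synthetic sample of raw events.
raw = pd.DataFrame({
    "customer_id": ["a", "a", "b"],
    "event_type": ["login", "export", "login"],
    "event_date": ["2026-01-01", "2026-01-02", "2026-01-01"],
    "duration_sec": [30.0, 120.0, 45.0],
})

features = build_customer_features(raw)
print(features)
```

Keeping each step as a pure function over a DataFrame makes the pipeline testable locally and easy to lift into a SageMaker processing or training job later.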
About GlobalLogic
GlobalLogic, a Hitachi Group Company, is a leader in digital engineering, with operations across 14 countries. We help brands across the globe design and build innovative products, platforms, and digital experiences for the modern world. By integrating experience design, complex engineering, and data expertise, we help our clients imagine what's possible and accelerate their transition into tomorrow's digital businesses. Headquartered in Silicon Valley, GlobalLogic operates design studios and engineering centers around the world, extending our deep expertise to customers in the automotive, communications, financial services, healthcare and life sciences, manufacturing, media and entertainment, semiconductor, and technology industries.
GlobalLogic operates under Hitachi, Ltd. (TSE: 6501), which, as its Social Innovation Business, contributes to a sustainable society with a higher quality of life by driving innovation through data and technology.
Related Jobs
3 roles aligned with this opportunity
Site Reliability Engineer
2026-04-10
DevOps Engineer
2026-04-10
Site Reliability Engineer
2026-04-07
- Posted: Mar 30, 2026
- Type: Full-time
- Level: Mid-Senior
- Location: Argentina
- Company: GlobalLogic