Our Purpose
Xceedance Australia Pty Ltd (www.xceedance.com) is a global provider of strategic consulting and managed services, technology, data sciences, and blockchain solutions to insurance organizations. Domiciled in Bermuda, with offices in the United States, United Kingdom, Germany, Poland, India, and Australia, Xceedance helps insurers launch new products, drive operations, implement intelligent technology, deploy advanced analytic capabilities, and achieve business process optimization. The experienced insurance professionals at Xceedance enable insurers, reinsurers, brokers, and program administrators worldwide to enhance policyholder service, enter new markets, boost workflow productivity, and improve profitability.

Xceedance has achieved phenomenal growth in the past six years, a tribute to the knowledge, scope, and impact of our people around the world. Everyone is laser-focused on delivering value to our customers. We are committed to the communities in which we live and work, and we are driven by a culture of innovation and integrity.

As a member of the Xceedance team, you can shape a fulfilling career, participate in exciting projects, and impact the organization in meaningful ways. Count on strong support to develop skills, grow quickly, and meet your professional aspirations. Relish working in a highly collaborative setting that features state-of-the-art resources, modern technology, and a comfortable, gratifying environment. Create solutions and fulfill your role alongside highly talented and dynamic colleagues who will motivate you to be agile and extremely productive. And enjoy the advantages of a superior benefits package.
Our Vision & Strategic Focus
To help catalyze insurance ecosystems for the benefit of society.
- Global insurance expertise and a customer-centric approach
- Vertical-focused and technology-enabled
- A learning organization
- A people-first culture
Job Title: Data Architect
Skills: Snowflake | SSIS | Azure Data Factory | SQL
Location: Sydney, Australia
Employment Type: Permanent
About the Role
We are seeking a highly skilled Data Architect with deep experience designing and implementing enterprise data solutions in the insurance domain, including Duck Creek Clarity/Insights. The ideal candidate will bring strong hands-on and architectural expertise across Snowflake, SSIS, and Azure Data Factory (ADF), driving scalable, secure, and governed data platforms that power reporting, analytics, and operational insights.
You will partner with business stakeholders, Duck Creek platform teams, engineering, and analytics teams to define data architecture standards, shape integration patterns, and deliver reliable data products supporting underwriting, claims, policy, billing, and customer analytics.
Key Responsibilities
Data Architecture & Strategy
- Define and own the end-to-end data architecture for Duck Creek-centric ecosystems, spanning ingestion, transformation, semantic layers, and consumption.
- Establish target-state architecture for Clarity/Insights data, including integration patterns with enterprise data platforms (Snowflake/Azure).
- Create and maintain conceptual, logical, and physical data models (policy, billing, claims, producer, customer, financial).
- Lead data platform modernization, including migration and re-platforming from legacy EDW/ETL to Snowflake and cloud-native patterns.
- Define standards for data quality, metadata, lineage, and governance across the data lifecycle.
Duck Creek Clarity & Insights
- Architect data flows for Duck Creek Clarity reporting and operational data extraction from Duck Creek systems.
- Design robust ingestion and downstream analytics architectures leveraging Duck Creek Insights datasets, reporting, and metrics.
- Partner with Duck Creek functional/technical teams to interpret data structures, events, and domain mappings, ensuring consistent definitions and reporting truth.
Snowflake Platform Architecture
- Design scalable Snowflake solutions: schemas, warehouses, clustering, micro-partitioning strategy, data sharing, and cost/performance optimization.
- Implement patterns for ELT in Snowflake using SQL-based transformations and orchestration tools (ADF/SSIS).
- Define data security architecture in Snowflake using RBAC, masking policies, row access policies, and secure views (as applicable).
Integration & Pipelines (SSIS + ADF)
- Build and oversee robust pipelines using Azure Data Factory (ADF) for ingestion/orchestration, scheduling, monitoring, retries, and alerting.
- Design, build, and optimize SSIS packages for hybrid integrations, legacy feeds, and incremental loads where required.
- Establish patterns for CDC/incremental loads, auditing, reconciliation, and operational observability.
Governance, Quality & Observability
- Define and implement data quality rules, monitoring, and remediation processes (DQ scorecards, exception handling).
- Own metadata/lineage approaches and documentation, aligning with enterprise governance frameworks.
- Promote best practices in DevOps for data, including CI/CD, code reviews, branching strategies, and environment management.
Stakeholder & Delivery Leadership
- Translate business requirements into architecture deliverables (HLD/LLD, integration specs, data contracts).
- Act as a technical leader/mentor for data engineers and analysts; drive design sessions and solution reviews.
- Collaborate across security, infrastructure, architecture, and compliance teams to ensure solutions are production-grade.
Required Skills & Experience
Must-Have
- 12+ years in data engineering/architecture roles with enterprise data platforms.
- Strong experience with Duck Creek Clarity and Duck Creek Insights (reporting/analytics/data structures).
- Strong expertise with Snowflake (data modeling, performance tuning, security, and cost optimization).
- Hands-on experience with Azure Data Factory (ADF) (pipelines, integration runtimes, triggers, parameterization).
- Strong hands-on experience with SSIS (package design, deployment, troubleshooting, performance tuning).
- Strong hands-on experience migrating data from legacy systems to Duck Creek.
- Proven experience designing dimensional (Kimball) and/or Data Vault models for analytics.
- Strong SQL skills; proven ability to optimize complex queries and transformations.
- Insurance domain knowledge (policy lifecycle, billing, claims, commissions, reinsurance) is a strong plus.
Nice-to-Have
- Azure ecosystem: ADLS Gen2, Azure Synapse, Databricks, Key Vault, Azure Monitor/Log Analytics.
- Experience implementing real-time/near-real-time patterns (Kafka/Event Hub, streams).
Education & Certifications
- Bachelor’s/Master’s in Computer Science, Information Systems, or equivalent experience.
- Preferred: Duck Creek Insights/Clarity certification, Snowflake certification, Azure (DP-203), or equivalent.
Deliverables & Artifacts You’ll Produce
- Target-state architecture diagrams (current vs future), integration reference architectures
- Canonical data models and domain mapping for Duck Creek data sources
- Snowflake design standards: naming conventions, environment layout, warehouse sizing strategy
- Data pipeline patterns (ADF/SSIS), error handling frameworks, logging and audit models
- Data quality framework and governance playbooks
What We Offer You
At Xceedance, you will discover opportunities to exceed your own expectations and grow continually with us. We offer career development opportunities that support well-rounded growth throughout your journey with us.
- Posted: Mar 31, 2026
- Type: Full-time
- Level: Director
- Location: Sydney
- Company: Xceedance