SGI
Data Engineer
SGI · Lithuania · 3 days ago
Contract · Engineering, Analyst +1

Role Overview

As a Data Engineer supporting the account, you will be responsible for designing, building, and maintaining scalable data pipelines and analytics platforms that support insurance operations, claims processing, and AI-driven workflows. You will work closely with architects, analysts, and AI engineers to deliver high-performance data solutions aligned with transformation goals.


Key Responsibilities

  • Develop and optimize ETL pipelines using Azure Data Factory, Databricks, and PySpark
  • Migrate legacy data workflows (e.g., SSIS, Autosys) to cloud-native solutions on Azure
  • Build and maintain Delta Lake architectures and data lakes for structured and unstructured data ingestion
  • Support human-in-the-loop (HITL) data intake processes and AI model retraining workflows
  • Implement RBAC, audit logging, and compliance features for secure data operations
  • Collaborate with cross-functional teams to define data requirements and deliver business-aligned solutions


Required Skills & Experience

  • 5+ years of experience in data engineering, preferably in insurance or financial services.
  • Strong proficiency in Azure, Databricks, PySpark, and SQL.
  • Experience with data lake architecture, Delta Lake, and cloud migration strategies.
  • Familiarity with workflow orchestration, metadata management, and data quality frameworks.
  • Understanding of AI-enhanced data workflows and document intelligence platforms.
  • Excellent problem-solving and communication skills.


Preferred Qualifications

  • Experience with Azure Functions, Azure Batch, and event-driven architectures.
  • Exposure to document ingestion, field extraction, and HITL UI design.
  • Knowledge of insurance business processes, especially claims and underwriting.