Burgan Bank Türkiye
Senior Data Engineer
Türkiye · Full-time

About the Role:

We are seeking an experienced Data Engineer to join our team at Burgan Bank Türkiye, working on our cutting-edge Data Lakehouse infrastructure. This is an exciting opportunity to be part of a transformative initiative that will shape the future of data infrastructure at our organization.


As a Data Engineer, you will play a crucial role in designing, building, and maintaining scalable data pipelines and infrastructure that enable advanced analytics and data-driven decision making across the bank.


Key Responsibilities:

• Design and develop robust, scalable data pipelines for both batch and streaming data processing

• Build and maintain data lakehouse architecture using modern data technologies

• Implement and optimize data models and data products to support analytical and operational use cases

• Develop and maintain ETL/ELT processes and workflow orchestrations

• Process large-scale data using distributed computing frameworks

• Implement real-time data streaming solutions

• Manage and optimize analytical databases for high-performance analytics

• Design and implement comprehensive monitoring and alerting systems for data pipelines

• Establish and maintain CI/CD pipelines

• Collaborate with data scientists, analysts, data product managers and business stakeholders to understand requirements

• Ensure data quality, reliability, and performance of data systems

• Troubleshoot and resolve data pipeline issues in production environments

• Document data engineering processes, architectures, and best practices


Required Qualifications:

• Bachelor's degree in Computer Science, Engineering, or a related field

• 5+ years of hands-on experience in data engineering or related roles

• Strong proficiency in Python for data processing

• Extensive experience with Apache Spark for batch and streaming data processing

• Proven experience with Apache Airflow for workflow orchestration and scheduling

• Hands-on experience with Apache Kafka for real-time data streaming

• Experience with either Apache Flink or RisingWave for stream processing

• Strong knowledge of and hands-on experience with ClickHouse for analytical workloads

• Solid understanding of Apache Iceberg as a data lake table format

• Experience in data modeling for both operational and analytical workloads

• Proven track record in implementing monitoring and alerting solutions for data pipelines

• Strong experience with CI/CD practices and tools (Azure DevOps, SolarWinds)

• Excellent SQL skills and experience with relational databases

• Understanding of distributed systems and big data architectures

• Strong problem-solving skills and ability to debug complex data pipeline issues

• Good command of English (both written and verbal)


Nice to Have:

• Experience in the banking or financial services industry

• Knowledge of data governance, data lineage, data quality frameworks and regulatory compliance

• Familiarity with containerization and orchestration tools (Docker, Kubernetes)

• Experience with data cataloging and metadata management tools

• Knowledge of additional data processing frameworks and tools

• Experience with data security and access control mechanisms

• Understanding of data warehousing concepts and dimensional modeling

• Familiarity with Agile/Scrum methodologies
