
Databricks with SAP BO Engineer

Consultancy Role in BFS-Capital Markets

Key Responsibilities

Design, build, and maintain scalable ETL/ELT pipelines using Databricks (PySpark, Delta Lake, SQL Warehouse).
Transform and curate data into bronze, silver, and gold layers following medallion architecture best practices.
Publish and expose gold layer datasets through Databricks SQL Warehouse for consumption by SAP BO.
Collaborate with BO developers to ensure semantic layer alignment.
Conduct data validation and reconciliation between Databricks outputs and BO report datasets.
Optimize data models, queries, and partitions for performance, cost, and scalability.
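The validation and reconciliation responsibility above can be sketched in a few lines. This is a minimal, hypothetical illustration in plain Python: the table shapes, column names, and sample figures are invented, and in practice both extracts would come from Databricks SQL Warehouse and the BO report's underlying dataset respectively.

```python
# Reconciliation sketch: compare a Databricks gold-layer extract against
# the dataset behind a BO report. All names and figures are hypothetical.
from decimal import Decimal

def reconcile(gold_rows, bo_rows, key, measures):
    """Compare two extracts on row count, key coverage, and measure totals.

    gold_rows / bo_rows: lists of dicts (one dict per row).
    key: column holding the business key.
    measures: numeric columns whose totals must match.
    Returns a dict of discrepancies; an empty dict means the extracts reconcile.
    """
    issues = {}

    # 1. Row-count check.
    if len(gold_rows) != len(bo_rows):
        issues["row_count"] = (len(gold_rows), len(bo_rows))

    # 2. Key coverage: every business key should appear on both sides.
    gold_keys = {r[key] for r in gold_rows}
    bo_keys = {r[key] for r in bo_rows}
    if gold_keys != bo_keys:
        issues["missing_in_bo"] = sorted(gold_keys - bo_keys)
        issues["missing_in_gold"] = sorted(bo_keys - gold_keys)

    # 3. Measure totals: sums must agree exactly (Decimal avoids float drift).
    for m in measures:
        g = sum(Decimal(str(r[m])) for r in gold_rows)
        b = sum(Decimal(str(r[m])) for r in bo_rows)
        if g != b:
            issues[f"total_{m}"] = (g, b)

    return issues

# Example: one trade amount differs between the two extracts.
gold = [{"trade_id": 1, "amount": 100.00}, {"trade_id": 2, "amount": 250.50}]
bo   = [{"trade_id": 1, "amount": 100.00}, {"trade_id": 2, "amount": 250.00}]
print(reconcile(gold, bo, key="trade_id", measures=["amount"]))
```

A real pipeline would run checks like these against full gold tables and flag any non-empty result before BO reports are refreshed.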

Required Skills & Experience

5 years of experience with Azure Databricks (PySpark, Delta Lake, SQL Warehouse).
Proficiency in SQL and data modelling (star/snowflake schemas).
Familiarity with SAP BusinessObjects universe and report structures; able to validate and support BO data consumption.
Experience working in banking or financial data environments preferred.
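The star-schema modelling skill above can be illustrated with a toy example: a fact table of trades keyed to a counterparty dimension, aggregated the way a BO report would query it. All table contents and column names here are invented for illustration; in Databricks these would be gold-layer Delta tables queried through SQL Warehouse.

```python
# Star-schema sketch: a fact table referencing a dimension by surrogate key.
# Data and names are hypothetical.
dim_counterparty = {
    10: {"name": "Alpha Bank", "country": "AE"},
    11: {"name": "Beta Capital", "country": "UK"},
}

fact_trades = [
    {"trade_id": 1, "counterparty_key": 10, "notional": 1_000_000},
    {"trade_id": 2, "counterparty_key": 11, "notional": 500_000},
    {"trade_id": 3, "counterparty_key": 10, "notional": 250_000},
]

def notional_by_country(facts, dim):
    """Join the fact table to the dimension and aggregate by country,
    mirroring the join-and-group query a BO report would issue."""
    totals = {}
    for row in facts:
        country = dim[row["counterparty_key"]]["country"]
        totals[country] = totals.get(country, 0) + row["notional"]
    return totals

print(notional_by_country(fact_trades, dim_counterparty))
# {'AE': 1250000, 'UK': 500000}
```

A snowflake variant would simply normalise the dimension further (e.g. country into its own table); the fact table and join pattern stay the same.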

Required Experience:

IC

Key Skills
Academics, Facilities Management, CMS, Life Science, Linq, Hospital
Employment Type: Full-Time
Experience: years
Vacancy: 1


Virtusa
Dubai - United Arab Emirates
Full time

Published on 05/06/2026
