Job Title: Consultant | SAP Datasphere | Bengaluru | SAP
SAP Databricks Developer (5–8 years)
Role Summary
As an SAP Databricks Developer, you will design, build, and optimize data pipelines using Databricks within SAP Business Data Cloud. You will focus on hands-on development, data processing, and integration of SAP and non-SAP data sources.
This role is execution-focused, ensuring reliable, high-performance data pipelines that support analytics and AI/ML use cases.
Responsibilities
• Develop and maintain ETL/ELT pipelines using PySpark, Python, and SQL
• Build and optimize data processing workflows using Databricks notebooks and SQL
• Ingest and integrate data from SAP systems and external sources
• Implement and manage Delta Lake tables
• Enable Delta Sharing for data exchange with SAP data products
• Work with object storage (e.g., HANA Data Lake Files) for large datasets
• Build and maintain data models and curated datasets
• Support AI/ML data preparation and feature engineering
• Ensure data quality, performance, and reliability of pipelines
• Troubleshoot and optimize data jobs
• Collaborate with architects and business teams
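To give candidates a concrete sense of the Delta Lake work described above, the following is a minimal, illustrative Databricks SQL sketch of the kind of pipeline step this role involves: creating a managed Delta table and performing an incremental upsert from a staged ingest table. All schema, table, and column names here are hypothetical, not part of any actual SAP or customer data model.

```sql
-- Hypothetical curated Delta table for SAP-sourced order data
-- (schema/table/column names are illustrative only)
CREATE TABLE IF NOT EXISTS curated.sales_orders (
  order_id      STRING,
  order_date    DATE,
  amount        DECIMAL(18, 2),
  source_system STRING
) USING DELTA;

-- Incremental upsert: merge newly ingested rows into the curated table,
-- updating existing orders and inserting new ones
MERGE INTO curated.sales_orders AS t
USING staging.sales_orders_ingest AS s
  ON t.order_id = s.order_id
WHEN MATCHED THEN UPDATE SET *
WHEN NOT MATCHED THEN INSERT *;
```

In practice, a step like this would typically run inside a Databricks notebook or job, with the staging table populated by an ingestion pipeline from SAP or external sources.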
Required Skills/Experience
• 5–8 years of experience in data engineering or development
• Strong programming skills in Python, PySpark, and SQL
• Hands-on experience with Databricks
• Experience building ETL/ELT pipelines using Apache Spark
• Experience with cloud platforms (AWS, Azure, or GCP)
• Strong understanding of data lakes and large-scale data processing
• Knowledge of data modeling and analytics workflows
• Strong problem-solving and analytical skills