Job Title:  Senior Consultant | Data Engineer | Pune | Engineering

Job requisition ID:  100040
Date:  Apr 1, 2026
Location:  Pune
Designation:  Senior Consultant
Entity:  Deloitte Touche Tohmatsu India LLP

Your potential, unleashed.

 

India’s impact on the global economy has increased at an exponential rate, and Deloitte presents an opportunity to unleash and realize your potential amongst cutting-edge leaders and organizations shaping the future of the region and, indeed, the world beyond.

 

At Deloitte, you can bring your whole self to work, every day. Combine that with our drive to propel with purpose and you have the perfect playground to collaborate, innovate, grow, and make an impact that matters.

 

What impact will you make? 

Every day, your work will make an impact that matters, while you thrive in a dynamic culture of inclusion, collaboration, and high performance. As the undisputed leader in professional services, Deloitte is where you will find unrivaled opportunities to succeed and realize your full potential.

 

The Team 

Deloitte’s Technology & Transformation practice can help you uncover and unlock the value buried deep inside vast amounts of data. Our global network provides strategic guidance and implementation services to help companies manage data from disparate sources and convert it into accurate, actionable information that can support fact-driven decision-making and generate an insight-driven advantage. Our practice addresses the continuum of opportunities in business intelligence & visualization, data management, performance management and next-generation analytics and technologies, including big data, cloud, cognitive and machine learning.

 

Role – Data Engineer (RAG / APIs)

Total experience – 3–9 years

Preferred Location – Pune

Education – Graduate

Must-have skills – RAG, Agentic AI, BigQuery

Key Responsibilities

  • Design and maintain approved data domains for agentic systems; build/curate BigQuery datasets for retrieval.
  • Implement grounding connectors, search pipelines, and RAG data flows.
  • Develop caching layers, context shaping logic, and response optimization for agent needs.
  • Manage data quality checks, lineage tracking, PII masking, and data minimization.
  • Build performant retrieval pipelines (indexing strategies, filters, similarity configs).
  • Develop APIs for data retrieval, transformation, and feature delivery.
  • Implement observability for data/ML pipelines: logs, metrics, tracing, error detection.
  • Optimize query performance and storage costs across BigQuery, GCS, and vector search systems.
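To illustrate the retrieval responsibilities above, here is a minimal sketch of a top-k similarity lookup with a metadata filter and a caching layer. It uses a toy in-memory corpus in place of BigQuery or a vector search system, and all names and data here are hypothetical, not part of the role description:

```python
import math
from functools import lru_cache

# Toy corpus standing in for a curated BigQuery dataset (hypothetical data).
# Each record: (doc_id, embedding, metadata)
CORPUS = [
    ("doc1", (0.9, 0.1, 0.0), {"domain": "finance"}),
    ("doc2", (0.1, 0.9, 0.0), {"domain": "hr"}),
    ("doc3", (0.7, 0.3, 0.0), {"domain": "finance"}),
]

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

@lru_cache(maxsize=256)  # simple caching layer for repeated agent queries
def retrieve(query_vec, k=2, domain=None):
    """Top-k similarity search with an optional metadata filter,
    mirroring 'indexing strategies, filters, similarity configs'."""
    candidates = [
        (doc_id, cosine(query_vec, emb))
        for doc_id, emb, meta in CORPUS
        if domain is None or meta["domain"] == domain
    ]
    return sorted(candidates, key=lambda t: t[1], reverse=True)[:k]

results = retrieve((1.0, 0.0, 0.0), k=2, domain="finance")
# → [("doc1", ...), ("doc3", ...)] — finance docs ranked by similarity
```

In production this lookup would be backed by BigQuery or a managed vector index rather than an in-memory list; the shape of the pipeline (embed, filter, rank, cache) stays the same.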

 

Required Skills & Qualifications

  • Bachelor’s in Computer Science, Data Engineering, or related field.
  • 4+ years of experience in data engineering; strong GCP expertise mandatory.
  • Proficiency in BigQuery, Dataflow, Pub/Sub, Dataproc, and Cloud Functions.
  • Experience with RAG pipelines, embeddings, vector indexing, and retrieval optimization.
  • Strong SQL and Python skills; experience with ETL/ELT frameworks.
  • Exposure to API development using Python/FastAPI/Go.
  • Strong understanding of data governance, PII protection, and compliance.
  • Experience with CI/CD for data pipelines.
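As a rough sketch of the PII-protection skill listed above (and the PII-masking responsibility earlier), the following masks common identifiers before records enter a retrieval corpus. The patterns are hypothetical illustrations; a real pipeline would rely on a vetted service such as Cloud DLP rather than hand-rolled regexes:

```python
import re

# Hypothetical masking rules for illustration only.
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")
PHONE_RE = re.compile(r"\b\d{10}\b")

def mask_pii(text: str) -> str:
    """Replace common PII patterns with placeholder tokens,
    supporting data minimization before indexing."""
    text = EMAIL_RE.sub("[EMAIL]", text)
    text = PHONE_RE.sub("[PHONE]", text)
    return text

masked = mask_pii("Contact jane.doe@example.com or 9876543210 for access.")
# → "Contact [EMAIL] or [PHONE] for access."
```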

 

Preferred Skills

  • Experience with Vertex AI Search, Vertex AI Vector Search, or Elasticsearch/OpenSearch.
  • Google Cloud Professional Data Engineer certification.
  • Familiarity with ML feature store design and operationalizing vector stores.