Job Title: Consultant | GCP Data Engineer | Mumbai | Engineering
Experience
- 3+ years of hands-on experience in designing, developing, and maintaining data pipelines on Google Cloud Platform (GCP).
Key Responsibilities
- Design, develop, and maintain batch and streaming data pipelines on GCP.
- Build data solutions using BigQuery, Cloud Storage, Dataflow, Dataproc, and Cloud Composer (Airflow).
- Work, under guidance, with GCP databases including Cloud SQL, Bigtable, Spanner, and AlloyDB.
- Develop data processing logic using SQL and at least one programming language (Python, Java, or Scala).
- Orchestrate data workflows using Cloud Composer / Apache Airflow.
- Participate in CI/CD pipeline implementation using tools such as Git, Jenkins, SonarQube, Artifactory, and Docker.
- Support data and application migration projects, including on-premises-to-cloud and cloud-to-cloud migrations.
- Implement data quality checks, validation rules, and basic monitoring.
- Apply GCP best practices for performance, reliability, and cost optimization.
- Collaborate with data scientists, analysts, and business stakeholders to understand data requirements.
- Troubleshoot and resolve data pipeline and performance issues.
- Document solutions, data flows, and operational procedures.
Required Skills
- Hands-on experience with GCP data services such as BigQuery, Dataflow, Dataproc, and Cloud Composer
- Strong SQL skills
- Proficiency in at least one of Python, Java, or Scala
- Understanding of data modeling, data warehousing, and big data concepts
- Familiarity with DevOps and CI/CD practices
- Good communication and collaboration skills
Preferred Qualifications
- Google Cloud Certified – Professional Data Engineer
- Exposure to streaming architectures (Pub/Sub, Dataflow streaming)
- Experience working in Agile delivery teams