Job Title: Technology and Transformation - EAD - Engineering - Senior Consultant - GCP
We are seeking a skilled and proactive GCP Data Engineer with 3–5 years of experience to join our data engineering team. The ideal candidate will have hands-on experience designing, developing, and maintaining robust and scalable data pipelines on Google Cloud Platform (GCP). You will collaborate with cross-functional teams to enable data-driven decision-making through efficient data solutions.
Key Responsibilities:
- Design and implement scalable and efficient data pipelines using GCP services such as BigQuery, Cloud Storage, Dataflow, Pub/Sub, and Cloud Composer (Airflow)
- Develop and maintain ETL/ELT processes to ingest, transform, and load structured and semi-structured data
- Optimize query performance in BigQuery and maintain a cost-effective data architecture
- Work with stakeholders to gather data requirements and translate them into technical specifications
- Implement data quality checks, monitoring, and error handling mechanisms
- Manage and automate workflows using Cloud Composer (a minimal orchestration sketch follows this list)
- Collaborate with data analysts, data scientists, and business teams to deliver clean, reliable data
- Ensure data security, compliance, and governance across the pipelines
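
To illustrate the kind of orchestration work involved, here is a minimal Cloud Composer (Airflow) sketch that loads daily files from Cloud Storage into a BigQuery staging table and then runs a transformation query. The project, bucket, dataset, and table names are placeholders, not part of any real environment.

```python
# Illustrative Cloud Composer (Airflow) DAG: ingest daily GCS exports into a BigQuery
# staging table, then transform them into a curated table. All names are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="daily_sales_ingest",          # hypothetical pipeline name
    schedule_interval="@daily",
    start_date=datetime(2024, 1, 1),
    catchup=False,
) as dag:
    # Load raw newline-delimited JSON files from Cloud Storage into a staging table
    load_raw = GCSToBigQueryOperator(
        task_id="load_raw_to_staging",
        bucket="example-landing-bucket",                       # placeholder bucket
        source_objects=["sales/{{ ds }}/*.json"],
        destination_project_dataset_table="analytics.staging_sales",
        source_format="NEWLINE_DELIMITED_JSON",
        write_disposition="WRITE_TRUNCATE",
    )

    # Aggregate the staging data into a curated reporting table
    transform = BigQueryInsertJobOperator(
        task_id="transform_to_curated",
        configuration={
            "query": {
                "query": """
                    SELECT order_id, customer_id, SUM(amount) AS total_amount
                    FROM `analytics.staging_sales`
                    GROUP BY order_id, customer_id
                """,
                "useLegacySql": False,
                "destinationTable": {
                    "projectId": "example-project",            # placeholder project
                    "datasetId": "analytics",
                    "tableId": "curated_sales",
                },
                "writeDisposition": "WRITE_TRUNCATE",
            }
        },
    )

    load_raw >> transform
```

This is a sketch of the pattern only; in practice the role also covers data quality checks, monitoring, and error handling around such DAGs, as listed above.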
Required Skills:
- 3–5 years of hands-on experience in data engineering roles
- Strong experience with Google Cloud Platform (GCP), especially BigQuery, Cloud Storage, and Dataflow
- Proficient in SQL and Python for data processing and automation (see the brief query example after this list)
- Experience with Airflow (Cloud Composer) for orchestrating workflows
- Good understanding of data modeling, data warehousing, and ETL best practices
- Familiarity with CI/CD pipelines and Git-based version control
- Problem-solving mindset with the ability to work independently and collaboratively
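
As a small example of the SQL and Python proficiency expected, the sketch below uses the BigQuery Python client to run a parameterized query that filters on a date partition to keep scanned bytes (and cost) low. The project, dataset, table, and column names are placeholders chosen for illustration.

```python
# Illustrative BigQuery client usage: parameterized, partition-pruned query.
# Project, dataset, and table names are placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")   # placeholder project ID

query = """
    SELECT customer_id, COUNT(*) AS orders
    FROM `example-project.analytics.curated_sales`
    WHERE order_date = @run_date   -- partition filter limits bytes scanned
    GROUP BY customer_id
"""

job_config = bigquery.QueryJobConfig(
    query_parameters=[
        bigquery.ScalarQueryParameter("run_date", "DATE", "2024-01-01"),
    ],
)

# Execute the query and print the aggregated results
for row in client.query(query, job_config=job_config).result():
    print(row.customer_id, row.orders)
```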