Job: Senior GCP Data Engineer

Senior GCP Data Engineer
  • Bangalore, Karnataka

Job Description

Required Skills:

  • 5+ years of experience with GCP services (Cloud Storage, Pub/Sub, Dataflow, Dataproc, Cloud Composer).
  • Strong expertise in Apache Kafka, Kafka Streams, and event-driven architectures.
  • Proficiency in Python and/or Java for data pipeline development using Apache Beam SDK.
  • Experience with healthcare data standards (HL7, FHIR) and handling semi-structured data.
  • Hands-on experience with streaming frameworks (Apache Beam, Dataflow) for near-real-time ingestion.
  • Knowledge of file formats and compression (JSON, Avro, Parquet) for raw data storage.
  • Understanding of CDC patterns, incremental loading, and data versioning strategies.
  • Experience with Cloud Storage lifecycle management and cost optimization.
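As a flavor of the CDC and incremental-loading bullet above, here is a minimal plain-Python sketch of merging a batch of change events into a keyed snapshot, keeping only the newest version per key and honoring deletes. The function name and event fields (`apply_cdc_batch`, `op`, `version`, `data`) are illustrative assumptions, not tied to any particular GCP service or library:

```python
# Illustrative sketch only: names and event shape are assumptions,
# not part of any specific CDC tool or GCP API.

def apply_cdc_batch(snapshot: dict, events: list) -> dict:
    """Apply upsert/delete events to a key -> record snapshot.

    Each event is a dict with:
      key     -- primary key of the record
      op      -- "upsert" or "delete"
      version -- monotonically increasing change version (e.g. a commit LSN)
      data    -- record payload for upserts
    Stale or duplicate events (version <= already applied) are ignored,
    which makes replaying a batch idempotent.
    """
    result = dict(snapshot)
    for ev in events:
        key, version = ev["key"], ev["version"]
        current = result.get(key)
        # Skip events older than what we already applied for this key.
        if current is not None and current["version"] >= version:
            continue
        if ev["op"] == "delete":
            result.pop(key, None)
        else:
            result[key] = {"version": version, "data": ev["data"]}
    return result


batch = [
    {"key": "p1", "op": "upsert", "version": 1, "data": {"name": "Ann"}},
    {"key": "p1", "op": "upsert", "version": 3, "data": {"name": "Anna"}},
    {"key": "p1", "op": "upsert", "version": 2, "data": {"name": "An"}},  # stale, ignored
    {"key": "p2", "op": "upsert", "version": 1, "data": {"name": "Bob"}},
    {"key": "p2", "op": "delete", "version": 2, "data": None},
]
snap = apply_cdc_batch({}, batch)
print(snap)  # p1 keeps version 3 ("Anna"); p2 was deleted
```

In a real pipeline this merge step would typically run inside a Beam `DoFn` or a BigQuery `MERGE`, with the version column sourced from the upstream database's log position.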

Job Overview

  • Posted date: 09 Apr 2026
  • Location: Bangalore, Karnataka
  • Experience: 5+ years
  • Job nature: Full-time

Company Information

  • Name: KGiSL
  • Web: https://www.schulist.com
  • Email: schulist@mailinator.com

Send your resume to: prasanth@reqroots.com
Contact: 7339602543
