Location: Bengaluru, Karnataka, India
- 3-5 years of experience; GCP experience is mandatory.
- Experience coding data pipelines on GCP.
- Prior experience with Hadoop systems; strong in programming languages such as Scala, Python, and Java.
- Good understanding of various data storage formats and their advantages.
- Exposure to GCP tools for developing end-to-end data pipelines across various scenarios, including ingesting data from traditional databases and integrating API-based data sources.
- Data development experience, with exposure to implementing complex data science solutions in production.
- Python, R, or Java.
- Store: Cloud SQL, Cloud Storage, Cloud Bigtable, BigQuery, Cloud Spanner, Cloud Datastore
- Ingest: Stackdriver, Pub/Sub, App Engine, Kubernetes Engine, Kafka, Dataprep, microservices
- Schedule: Cloud Composer
- Processing: Cloud Dataproc, Cloud Dataflow, Cloud Dataprep
- CI/CD: Bitbucket + Jenkins / GitLab
- Atlassian Suite
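The Ingest/Processing/Store split above can be sketched in a toolchain-neutral way. This is a minimal, standard-library-only Python sketch with invented helper names (`ingest`, `process`, `store`) and made-up sample data; in a real GCP deployment the ingest stage might read from Pub/Sub, the processing stage might run as a Dataflow or Dataproc job, and the store stage might load into BigQuery.

```python
import csv
import io

# Hypothetical three-stage pipeline mirroring the Ingest -> Processing -> Store
# split listed above. Each stage is a plain function over in-memory data; the
# GCP services named in the comments are what each stage would stand in for.

def ingest(raw_csv: str):
    """Parse raw CSV text into dict records (stand-in for a Pub/Sub source)."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def process(records):
    """Filter and transform records (stand-in for a Dataflow transform)."""
    return [
        {"user": r["user"], "amount_cents": int(float(r["amount"]) * 100)}
        for r in records
        if float(r["amount"]) > 0  # drop refunds / zero-amount rows
    ]

def store(records):
    """Aggregate records per user (stand-in for a BigQuery load job)."""
    table = {}
    for r in records:
        table[r["user"]] = table.get(r["user"], 0) + r["amount_cents"]
    return table

if __name__ == "__main__":
    raw = "user,amount\nalice,10.50\nbob,-2.00\nalice,1.25\n"
    print(store(process(ingest(raw))))
```

In practice, Cloud Composer (listed under Schedule) would orchestrate these stages as tasks in a DAG rather than as direct function calls.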