Hadoop Developer

Primary Skills: Hadoop Ecosystem, Big Data, Google Cloud Dataflow (4-6 yrs of exp)

Experience: 4-6 years

Secondary Skills: SQL

Requirements:

GCP BigQuery
GCP Dataproc
Python, SQL

Job Description:

  • Build and maintain data management workflows and data ingestion pipelines for batch, micro-batch, and real-time streaming on BigQuery with Google Cloud
  • GCP-certified developer on BigQuery and Dataproc
  • Experience in building data ingestion pipelines for batch, micro-batch, and real-time streaming on big data/Hadoop platforms
  • Hands-on experience with Hadoop big data tools: HDFS, Hive, Presto, Apache NiFi, Sqoop, Spark, Logstash, Elasticsearch, Kafka, and Pulsar
  • Experience in collecting data from a Kafka/Pulsar message bus and transporting it to public/private cloud platforms using NiFi, Data Highway, and Logstash technologies
  • Experience in building CI/CD pipelines and DevOps practices is preferred
  • Development experience with Agile Scrum/SAFe methodologies