Hadoop Ecosystem

Position: Hadoop Ecosystem

Work Location: India

Duration: 6–12 months, contract to hire

Experience: 6+ Years

Job Description:

Spark, Python, Scala

● At least 8 years of experience building and managing complex products and solutions.

● Minimum 5 years of experience with the Hadoop ecosystem (Spark/Scala/Python preferred) and with backend software modules using Scala, Spark, and Java

● Minimum 5 years of experience in shell scripting

● Minimum 8 years of experience working in a Linux/Unix environment

● Experience in the design and development of data ingestion services capable of ingesting tens of terabytes of data

● Expert-level programming in Java, Scala, and Python

● Experience in developing ETL modules for AI/ML use cases, developing algorithms, and testing

● Minimum 5 years of experience with performance optimization on Spark, Hadoop, and any NoSQL database

● Minimum 5 years of experience testing and debugging data pipelines based on Hadoop and Spark

● Experience debugging production issues and performance scenarios

Thanks & Regards

Mounica Peddi

mounica@burgeonits.com

+91- 8297300040
