
Big Data Hadoop Developer

Mumbai
Full-Time, Internship
Mid-Level: 4 to 6 years
1L - 10.1L (Per Year)
Posted on Dec 23 2022

Not Accepting Applications

About the Job

Skills

Python (Programming Language)
Spark
Hadoop Ecosystem
Hive
RDBMS Systems

Skills and Qualifications

- Must have at least 3-5 years of experience in the Hadoop ecosystem and its various components.

- Must have extensively worked with Python, Spark, and Hive, along with RDBMS systems.

- Understanding of data warehousing and data modelling techniques

- Knowledge of industry-wide analytical and visualization tools (e.g. Tableau)

- Preferred to have knowledge of HBase, Kafka, ZooKeeper, or other Apache software

- Knowledge of core Java, UNIX/Linux, SQL, Oracle, and any scripting language

- Good interpersonal skills and positive attitude

Preferred Qualifications

- Degree in computer science, mathematics, or engineering

- Expertise in ETL methodology (data extraction, transformation, and load processing) for corporate-wide ETL solution design using DataStage


Job Description:

Objectives of this Role :

- Develop and implement data pipelines that extract, transform, and load data into information products that help the organization reach its strategic goals

- Work on ingesting, storing, processing, and analyzing large data sets from RDBMS systems (Oracle/SQL/MySQL, etc.).

- Translate complex technical and functional requirements into detailed designs

- Investigate and analyze alternative solutions for data storage, processing, etc. to ensure the most streamlined approaches are implemented

Daily and Monthly Responsibilities :

- Develop and maintain data pipelines that implement ETL processes

- Build, operate, monitor, and troubleshoot Hadoop infrastructure.

- Write software to interact with HDFS and MapReduce.

- Take responsibility for Hadoop ecosystem development and implementation.

- Work closely with the data science team to implement data analytics pipelines

- Maintain security and data privacy, working closely with the Data Protection Officer internally

- Analyse a large number of data stores and uncover insights

- Write software to ingest data into the Hadoop ecosystem.

- Assess requirements and evaluate existing solutions.

- Develop documentation and playbooks to operate Hadoop infrastructure.

About the company

Nibodhah is aligned to the key trends shaping the world of work. Connecting talented executives with companies in need of their skills is what we do.

Industry

Staffing & Recruiting

Company Size

11-50 Employees

Headquarters

Ahmedabad
