

More Job Descriptions for hadoop developer:
1
hadoop developer
- Work with the application team to design and develop an effective Hadoop solution. Be actively engaged in, and responsible for, the development process
- Work to tight deadlines and provide regular progress updates against agreed milestones
- Loading data onto the cluster, building orchestration jobs using Sqoop/Flume and writing Hive queries to support analysis
- The data volume is around 750 GB, an apt case for Hadoop-style computation. Using loading utilities such as Sqoop, data is loaded onto the cluster and cleaned.
- Thereafter, various business-defined algorithms are applied to the data loaded into Hive tables to derive an indicator variable
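The load-clean-flag workflow described in this example can be sketched in miniature. The business rule, field names and threshold below are hypothetical stand-ins for whatever the business-defined algorithm actually computes over the Hive tables:

```python
# Minimal sketch of the indicator-variable step. Rows stand in for
# cleaned records already loaded into a Hive table; the rule and the
# threshold are illustrative assumptions, not the actual logic.

def indicator(row, threshold=1000):
    """Toy business rule: flag accounts whose monthly spend exceeds
    a threshold and that had at least one transaction."""
    return 1 if row["monthly_spend"] > threshold and row["txn_count"] > 0 else 0

rows = [
    {"account": "A1", "monthly_spend": 1500, "txn_count": 3},
    {"account": "A2", "monthly_spend": 200, "txn_count": 1},
]
flags = {r["account"]: indicator(r) for r in rows}
# flags == {"A1": 1, "A2": 0}
```

In the real pipeline the same rule would be expressed as a Hive query (or UDF) so it runs on the cluster rather than on a single machine.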
2
hadoop developer
- Project Name: NPM (NH21 component & YUKON component)
- Organization: Harman
- Client : British Telecom
- Role: Hadoop Developer
- Hadoop-based project for performance monitoring and reporting of the network. The NPM solution is designed to monitor, optimize and utilize the network effectively.
- It monitors the performance of various network devices. NPM performs hourly, daily and monthly aggregations on a large number of metrics received for each network sub-element it monitors.
- These insights help the Network Operations Centre (NOC) team to monitor and maintain performance of the network.
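The hourly aggregation NPM performs per network sub-element can be illustrated with a small sketch; the element IDs, metric shape and averaging rule are assumptions for illustration:

```python
# Sketch of an hourly roll-up over per-element metric samples,
# the kind of aggregation NPM runs hourly/daily/monthly.
from collections import defaultdict
from datetime import datetime

def hourly_avg(samples):
    """Average each element's metric values per hour bucket.
    samples: iterable of (element_id, timestamp, value)."""
    buckets = defaultdict(list)
    for element_id, ts, value in samples:
        hour = ts.replace(minute=0, second=0, microsecond=0)
        buckets[(element_id, hour)].append(value)
    return {key: sum(vals) / len(vals) for key, vals in buckets.items()}

samples = [
    ("cell-01", datetime(2020, 1, 1, 9, 5), 10.0),
    ("cell-01", datetime(2020, 1, 1, 9, 40), 20.0),
    ("cell-01", datetime(2020, 1, 1, 10, 2), 30.0),
]
result = hourly_avg(samples)
# result[("cell-01", datetime(2020, 1, 1, 9))] == 15.0
```

Daily and monthly aggregations follow the same pattern with a coarser bucket key.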
3
hadoop developer
- Worked as a Hadoop developer for Reliance Jio media apps such as Jio TV, Jio Music, Jio Video On Demand, Jio News and Jio Magazines. Collected data as users interacted with each application, stored that live data in databases and surfaced user behavior on dashboards using Elasticsearch.
- Provided end-to-end implementation of data flows from different internal systems into the Hadoop data lake using big data tools such as Kafka, Flume and Hive.
- Responsible for technical design preparation and updates.
- Gave daily status updates and followed Scrum processes.
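The user-behavior reporting in this example boils down to aggregating raw app-usage events into counts a dashboard can display. A toy sketch, with made-up event fields standing in for whatever the apps actually emit:

```python
# Toy sketch: turn raw usage events into per-app play counts, the
# kind of aggregate an Elasticsearch dashboard would visualize.
# Event schema here is an illustrative assumption.
from collections import Counter

events = [
    {"user": "u1", "app": "Jio TV", "action": "play"},
    {"user": "u2", "app": "Jio Music", "action": "play"},
    {"user": "u1", "app": "Jio TV", "action": "play"},
]
views_per_app = Counter(e["app"] for e in events if e["action"] == "play")
# views_per_app["Jio TV"] == 2
```

In production the events would arrive via Kafka/Flume and the aggregation would run on the cluster, but the shape of the computation is the same.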
4
hadoop developer
- Analyzed the Hortonworks Hadoop Architecture of production servers and clusters.
- Worked on analyzing the Hadoop cluster and different big data analytics tools, including Oozie, HBase, Hive, NoSQL databases and Sqoop.
- Imported and exported data between sources such as MySQL, SQL Server and local file systems and HDFS/Hive.
- Wrote Hive UDFs to extract data from staging tables.
- Wrote Linux shell scripts to automate Sqoop commands and Oozie workflows to import multiple tables at once into Hive.
- Involved in creating Hive tables, loading them with data and writing Hive queries that run internally as MapReduce jobs.
- Wrote efficient Oozie workflows, sub-workflows and coordinators for data import and export
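Automating multi-table Sqoop imports, as described above, usually amounts to generating one import command per table. A sketch of that generation step; the JDBC URL, database names and flags are hypothetical:

```python
# Sketch of what such automation does: build one Sqoop import
# command per source table. Connection details are placeholders.

def sqoop_import_cmd(table, jdbc_url="jdbc:mysql://db-host/sales",
                     hive_db="staging"):
    return (
        f"sqoop import --connect {jdbc_url} "
        f"--table {table} --hive-import "
        f"--hive-table {hive_db}.{table} -m 4"
    )

tables = ["orders", "customers"]
commands = [sqoop_import_cmd(t) for t in tables]
```

A shell script (or an Oozie coordinator) would then execute each generated command, so adding a table to the import set is a one-line change.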
5
hadoop developer
- Data Migration from IBM Netezza to Hadoop
- Developed data pipelines from Level 1 to Level 3.
- Experience in writing scripts using HQL.
- Developed APIs using Apache Spark, SBT, and IntelliJ IDEA.
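One recurring piece of a Netezza-to-Hadoop migration like the one above is regenerating table DDL with Hive-compatible column types. A simplified sketch; the type mapping below is a partial assumption, not an exhaustive or authoritative list:

```python
# Sketch of DDL regeneration during a Netezza -> Hive migration.
# The type map is deliberately small and illustrative.

NETEZZA_TO_HIVE = {
    "BYTEINT": "TINYINT",
    "INTEGER": "INT",
    "NUMERIC": "DECIMAL",
    "VARCHAR": "STRING",
    "TIMESTAMP": "TIMESTAMP",
}

def hive_ddl(table, columns):
    """columns: list of (name, netezza_type) pairs."""
    cols = ", ".join(
        f"{name} {NETEZZA_TO_HIVE.get(ntype, 'STRING')}"
        for name, ntype in columns
    )
    return f"CREATE TABLE {table} ({cols}) STORED AS ORC"

ddl = hive_ddl("sales", [("id", "INTEGER"), ("amount", "NUMERIC")])
# ddl == "CREATE TABLE sales (id INT, amount DECIMAL) STORED AS ORC"
```

Unknown types fall back to STRING here, which is a common conservative default during bulk migration.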