Data Science Intern

  • Developed new algorithms and tested and improved existing algorithms for forecasting electric load in smart grids, including Neural Networks, Generalized Linear Models, and time-series models.
  • Identified inefficiencies in machine learning model implementations after performing ETL on, and visualizing, raw unstructured data.
  • Served as a problem setter.
  • Created quizzes and minor and major projects for students.
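A minimal sketch of the kind of time-series load forecast described above, using an AR(1) model fit by ordinary least squares; the load values and one-step recursion are illustrative, not the project's actual models:

```python
# Fit y_t = a + b * y_{t-1} by least squares, then roll the model
# forward to forecast future load. Pure Python, no dependencies.

def fit_ar1(series):
    """Return (intercept, slope) of y_t regressed on y_{t-1}."""
    x = series[:-1]
    y = series[1:]
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    cov = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    var = sum((xi - mx) ** 2 for xi in x)
    b = cov / var
    a = my - b * mx
    return a, b

def forecast(series, steps, a, b):
    """Roll the fitted AR(1) model forward `steps` periods."""
    out = []
    last = series[-1]
    for _ in range(steps):
        last = a + b * last
        out.append(last)
    return out

load = [100, 104, 103, 108, 110, 109, 113]  # hypothetical hourly load (MW)
a, b = fit_ar1(load)
print(forecast(load, 3, a, b))
```

A GLM or neural network would replace `fit_ar1` with a richer feature set (weather, calendar effects), but the fit-then-roll-forward structure is the same.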

Data Science Intern

  • Worked on a project to automatically scrape specific elements from different course websites.
  • Used supervised machine learning models to classify course-related content on these websites.
  • Applied Latent Dirichlet Allocation for topic modeling to identify the key satisfaction drivers at these Sky Clubs (such as food/beverages, friendliness of staff, and service) that aided in attaining high customer satisfaction ratings.
  • Used Random Forest and XGBoost models to measure overall feature importance and determine the factors with the greatest impact on Delta's NPS scores.
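A minimal sketch of scraping specific elements from a course page, using only the standard library; the HTML snippet and the "course-title" class name are hypothetical:

```python
from html.parser import HTMLParser

class CourseTitleScraper(HTMLParser):
    """Collect the text of tags whose class attribute is 'course-title'."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.titles = []

    def handle_starttag(self, tag, attrs):
        if ("class", "course-title") in attrs:
            self.in_title = True

    def handle_endtag(self, tag):
        self.in_title = False

    def handle_data(self, data):
        if self.in_title and data.strip():
            self.titles.append(data.strip())

html = """
<ul>
  <li class="course-title">Intro to Machine Learning</li>
  <li class="course-title">Applied Statistics</li>
  <li class="note">Enrollment closed</li>
</ul>
"""

scraper = CourseTitleScraper()
scraper.feed(html)
print(scraper.titles)
```

A real pipeline would typically fetch pages with an HTTP client and use a dedicated parser such as BeautifulSoup, but the select-by-attribute idea is the same.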

Data Science Intern

  • Analyzed and processed complex data sets using advanced querying, visualization, and analytics tools.
  • Used web-scraping techniques to extract and organize data.
  • Prepared reports through data cleaning, data visualization, and written insights.
  • Streamlined data-update processes, reducing redundancy by 20%.
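A minimal sketch of one cleaning step behind a redundancy reduction like the one above: normalize records, then keep only the first occurrence of each. The record fields and values are illustrative:

```python
def clean(records):
    """Strip/lowercase string fields, then drop duplicate rows."""
    seen = set()
    out = []
    for rec in records:
        norm = tuple(v.strip().lower() if isinstance(v, str) else v for v in rec)
        if norm not in seen:
            seen.add(norm)
            out.append(norm)
    return out

raw = [("Alice ", 30), ("alice", 30), ("Bob", 25), ("Bob", 25), ("Carol", 41)]
cleaned = clean(raw)
print(cleaned)                      # deduplicated, normalized rows
print(1 - len(cleaned) / len(raw))  # fraction of redundant rows removed
```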

Data Science Intern

  • Learned Python from scratch.
  • Evaluated various machine learning algorithms to select the best-performing models.
  • Became familiar with data visualization and performed data analysis across multiple projects.
  • Conducted sentiment analysis of customer feedback from Delta Sky Clubs across various US cities using machine learning techniques.

Data Science Intern

  • Worked as a Python developer.
  • Worked on data pre-processing; gained experience with database systems and AWS services.
  • Learned the basics of Natural Language Processing.
  • Explored and evaluated solutions for analyzing 1.5 TB of data.
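A minimal sketch of the streaming idea behind analyzing data too large for memory: process the input line by line and keep only a running aggregate. The tokenization and sample text are illustrative:

```python
# Stream text and accumulate word counts without ever loading
# the whole input into memory; works the same on a 1.5 TB file
# opened with open(path) as on the in-memory sample below.
from collections import Counter
from io import StringIO

def token_counts(stream):
    """Accumulate lowercase word counts from an iterable of lines."""
    counts = Counter()
    for line in stream:
        counts.update(line.lower().split())
    return counts

sample = StringIO("the quick fox\nthe lazy dog\n")
print(token_counts(sample).most_common(1))  # [('the', 2)]
```

At terabyte scale the same aggregation would usually be sharded across workers (e.g. on AWS), with per-shard counters merged at the end.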