Employee_Attrition An Industrial Example

Last Updated on May 3, 2021

About

Predicted which employees will leave the company in advance using historic data. Used Decision Tree, Linear Regression, Random Forest, XGBoost, AdaBoost, and neural network models on the processed data to predict the target variable. Comparing the AUC-ROC values of all models, we concluded that the XGBoost model had the highest AUC-ROC.
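A minimal sketch of how such a comparison could be coded with scikit-learn and xgboost, assuming the processed data sits in a CSV with a binary Attrition target (the file name, column name, and hyper-parameters are assumptions; Logistic Regression stands in for the listed Linear Regression so every model exposes a probability for the AUC-ROC comparison):

```python
# Sketch: train several classifiers and compare them by AUC-ROC.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score
from sklearn.tree import DecisionTreeClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestClassifier, AdaBoostClassifier
from sklearn.neural_network import MLPClassifier
from xgboost import XGBClassifier

df = pd.read_csv("attrition_processed.csv")        # hypothetical file name
X, y = df.drop(columns=["Attrition"]), df["Attrition"]
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42)

models = {
    "Decision Tree": DecisionTreeClassifier(random_state=42),
    "Logistic Regression": LogisticRegression(max_iter=1000),
    "Random Forest": RandomForestClassifier(n_estimators=200, random_state=42),
    "AdaBoost": AdaBoostClassifier(random_state=42),
    "XGBoost": XGBClassifier(eval_metric="logloss", random_state=42),
    "Neural Network": MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=500),
}

for name, model in models.items():
    model.fit(X_train, y_train)
    scores = model.predict_proba(X_test)[:, 1]     # probability of leaving
    print(f"{name}: AUC-ROC = {roc_auc_score(y_test, scores):.3f}")
```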

More Details: Employee_Attrition an Industrial Example

Submitted By



False Alarm Detection System

Last Updated on May 3, 2021

About

This project was made for a chemical company that had sensors installed in various parts of its factory to detect H2S gas, which is hazardous to health. Every time one or more sensors detected an H2S leak, an emergency alarm rang to alert the workers. For every alarm, the company called in a team to sanitize the area and check for the leak, and this was a big cost to the company.

A few of the alarms that ring are not actually hazardous. The company gave us the data recorded for each alarm, with a final column stating whether the alarm was dangerous or not:

- Ambient Temperature
- Calibration (days)
- Unwanted substance deposition (0/1)
- Humidity (%)
- H2S Content (ppm)
- Dangerous (0/1)

The data was first pre-processed using analysis libraries such as NumPy and Pandas to make it ready for a machine learning algorithm.

Issues such as feature scaling, categorical data, and missing values were handled with appropriate techniques.
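A minimal sketch of that pre-processing with scikit-learn, assuming the columns listed above are loaded into a pandas DataFrame (the snake_case column names, the file name, and the imputation/scaling choices are assumptions):

```python
# Sketch of the pre-processing step for the alarm data.
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.impute import SimpleImputer
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

numeric_cols = ["ambient_temperature", "calibration_days",
                "humidity", "h2s_content_ppm"]
binary_cols = ["unwanted_substance_deposition"]

preprocessor = ColumnTransformer([
    # impute missing sensor readings, then standard-scale the numeric columns
    ("num", Pipeline([("impute", SimpleImputer(strategy="median")),
                      ("scale", StandardScaler())]), numeric_cols),
    # binary flag: just fill missing values, no scaling needed
    ("bin", SimpleImputer(strategy="most_frequent"), binary_cols),
])

df = pd.read_csv("alarms.csv")                 # hypothetical file name
X = preprocessor.fit_transform(df[numeric_cols + binary_cols])
y = df["dangerous"]
```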

Then we used a Logistic Regression model to build a classifier, with the first five columns as the independent features and the Dangerous column as the dependent/target variable.

Now, whenever there is a leak and the alarm rings, the sensor data is sent to us and we predict whether it is dangerous or not. Only if it is found to be dangerous is the team called in to sanitize the area and fix the leak. This saved the company a lot of money.
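Continuing the sketch above, the classifier and the "score a new alarm" step could look roughly like this (X, y, and preprocessor come from the pre-processing sketch, and the feature values below are made up purely for illustration):

```python
# Sketch: fit Logistic Regression on the five feature columns and
# score a new alarm when it arrives.
import pandas as pd
from sklearn.linear_model import LogisticRegression

clf = LogisticRegression()
clf.fit(X, y)                                  # X, y from the pre-processing sketch

# A new alarm arrives: transform it the same way, then predict.
new_alarm = pd.DataFrame([{
    "ambient_temperature": 31.5,               # illustrative values only
    "calibration_days": 120,
    "unwanted_substance_deposition": 0,
    "humidity": 64.0,
    "h2s_content_ppm": 9.2,
}])
is_dangerous = clf.predict(preprocessor.transform(new_alarm))[0]
print("Call the sanitation team" if is_dangerous else "False alarm")
```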

More Details: False Alarm Detection System

Submitted By


Machine Learning (Heart Disease Prediction Model)

Last Updated on May 3, 2021

About

This is a web-based API model that predicts the probability of having heart disease.

Here I had a dataset of patients with information such as CRF, hypothyroidism, HT, and DM.

I split the data so that I could train models and then test the predictions by computing accuracy with various algorithms in Python.

The Python libraries used here are numpy, matplotlib, pandas, sklearn, and pickle.

I preprocessed the data and tried various train/test splitting options.

I explored the data with various plots using matplotlib.

I used numpy and pandas to read the data and examine various statistics.
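A minimal sketch of this loading-and-splitting step, assuming the data is in a CSV with a binary target column (the file name and column name are assumptions):

```python
# Sketch: read the data, inspect basic statistics, and split it.
import pandas as pd
from sklearn.model_selection import train_test_split

df = pd.read_csv("heart_disease.csv")          # hypothetical file name
print(df.describe())                           # summary statistics
print(df.isnull().sum())                       # missing values per column

X = df.drop(columns=["target"])                # "target": has disease (1) or not (0)
y = df["target"]
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=0)
```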

I used various algorithms, such as:

Random Forest (model file on GitHub as modelRF.py)

Decision Tree (modelDT.py)

SVM (modelSVM.py)

ANN (modelANN.py)

Naive Bayes (modelNB.py)

For each algorithm I fitted the model on the training data, saved it to disk, loaded it back using the pickle library, and finally compared the results.

The accuracy was computed for each algorithm, and all of them scored above 85%.

All of this model building was done in separate model files, such as modelNB.py (Naive Bayes) and modelSVM.py (Support Vector Machine), according to the algorithm.
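Continuing the earlier sketch, the per-algorithm fit / pickle-save / reload / compare-accuracy loop could look roughly like this (the hyper-parameters and file names are assumptions, and X_train, X_test, y_train, y_test come from the split above):

```python
# Sketch of the per-algorithm workflow: fit, save to disk with pickle,
# load back, and compare test accuracy.
import pickle
from sklearn.ensemble import RandomForestClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.svm import SVC
from sklearn.neural_network import MLPClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.metrics import accuracy_score

models = {
    "modelRF": RandomForestClassifier(random_state=0),
    "modelDT": DecisionTreeClassifier(random_state=0),
    "modelSVM": SVC(probability=True),
    "modelANN": MLPClassifier(hidden_layer_sizes=(32,), max_iter=500),
    "modelNB": GaussianNB(),
}

for name, model in models.items():
    model.fit(X_train, y_train)                       # fit on the training split
    with open(f"{name}.pkl", "wb") as f:              # save the model to disk
        pickle.dump(model, f)
    with open(f"{name}.pkl", "rb") as f:              # load it back
        loaded = pickle.load(f)
    acc = accuracy_score(y_test, loaded.predict(X_test))
    print(f"{name}: accuracy = {acc:.2%}")
```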

After finding the accuracy for every algorithm, I finally built a web application using the flask library (with request, jsonify, and render_template), along with keras, and loaded the saved model using pickle.

The final features are passed to the model for prediction, and the application itself was created as app.py.

As the model runs on localhost, we also added various HTML tags and CSS styling to make it more presentable.
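A stripped-down sketch of what app.py could look like under these assumptions: Flask loads the pickled model and serves predictions from a simple HTML form (the route names, form handling, and model file are assumptions, not the repository's actual code):

```python
# Minimal sketch of app.py: load the pickled model and serve predictions.
import pickle
import numpy as np
from flask import Flask, request, jsonify, render_template

app = Flask(__name__)
with open("modelRF.pkl", "rb") as f:          # hypothetical: best-performing model
    model = pickle.load(f)

@app.route("/")
def home():
    return render_template("index.html")      # simple HTML/CSS form

@app.route("/predict", methods=["POST"])
def predict():
    # read the clinical features submitted from the HTML form
    features = [float(x) for x in request.form.values()]
    proba = model.predict_proba(np.array([features]))[0][1]
    return jsonify({"heart_disease_probability": round(float(proba), 3)})

if __name__ == "__main__":
    app.run(debug=True)                       # runs on localhost
```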

The code is shared freely on GitHub.

The link is added below.

More Details: Machine Learning (Heart Disease prediction model)

Submitted By


Indian Railways

Last Updated on May 3, 2021

About

- Implemented and designed an Indian Railways website, which I built from 13th March to 20th March 2021.

- In this website the user can fetch the complete schedule of trains all over India, including their arrival time, departure time, and number of stoppages with each station's name and code.

- Applied HTML, CSS, JavaScript, and Bootstrap, as well as a REST API, to fetch the detailed schedules of various trains in India.

- The website takes the train number and date as input from the user for the particular train whose details they want to fetch; after clicking the Get Schedule button, a number of flex cards appear on the screen, one per stoppage, including the source and destination stations.

- These cards have two faces: the front shows the serial number at the top and the station name with its code below, while the back shows the number of days, the arrival time, and the departure time.

- By default the front side of the card is shown, but hovering over the card reveals the details on the back.

- The project also includes some CSS animations and a live timer clock in the middle of the page.


More Details: Indian Railways

Submitted By


HackTube

Last Updated on May 3, 2021

About

A Chrome extension that fights online harassment by filtering out comments with strong language.

Inspiration

YouTube is a place for millions of people to share their voices and engage with their communities. Unfortunately, the YouTube comments section is notorious for enabling anonymous users to post hateful and derogatory messages with the click of a button. These messages are purely meant to cause anger and depression without ever providing any constructive criticism. For YouTubers, this means seeing the degrading and mentally-harmful comments on their content, and for the YouTube community, this means reading negative and offensive comments on their favorite videos. As young adults who consume this online content, we feel as though it is necessary to have a tool that combats these comments to make YouTube a safer place.

What it does

HackTube automatically analyzes every YouTube video you watch, targeting comments which are degrading and offensive. It is constantly checking the page for hateful comments, so if the user loads more comments, the extension will pick those up. It then blocks comments which it deems damaging to the user, listing the total number of blocked comments at the top of the page. This process is all based on user preference, since the user chooses which types of comments (sexist, racist, homophobic, etc) they do not want to see. It is important to note that the user can disable the effects of the extension at any time. HackTube is not meant to censor constructive criticism; rather, it combats comments which are purely malicious in intent.

How we built it

HackTube uses JavaScript to parse through every YouTube comment almost instantly, comparing its content to large arrays that we made which are full of words that are commonly used in hate speech. We chose our lists of words carefully to ensure that the extension would focus on injurious comments rather than helpful criticism. We used standard HTML and CSS to style the popup for the extension and the format of the censored comments.

Challenges we ran into

We are trying to use cookies to create settings for the user which would be remembered even after the user closes the browser. That way anyone who uses HackTube will be able to choose exactly which types of comments they don't want to see and then have those preferences remembered by the extension. Unfortunately, Chrome blocks the use of cookies unless you use a special API, and we didn't have enough time to complete our implementation of that API at this hackathon.

Accomplishments that we're proud of

We are proud of making a functional product that can not only fight online harassment and cyberbullying but also appeal to a wide variety of people.

What we learned

We learned how to dynamically alter the source code of a webpage through a Chrome extension. We also learned just how many YouTube comments are full of hate and malicious intent.

What's next for HackTube

Right now, for demo purposes, HackTube merely changes the hateful comments into a red warning statement. In the future, HackTube will have an option to fully take out the malicious comment, so users’ YouTube comments feed will be free of any trace of hateful comments. Users won’t have to worry about how many comments were flagged and what they contained. Additionally, we will have a way for users to input their own words that offend them and take the comments that contain those words out of the section.

More Details: HackTube

Submitted By


Long Term Tool

Last Updated on May 3, 2021

About

My previous project was the shear project, namely the Long Term Tool. This tool is used by wind farm owners who want to know which location will give the best profits.

Suppose A wants to start a wind farm business. A has the money but does not know the wind speeds at a particular location, so he takes help from B (Wind Pioneers). Wind Pioneers uses a sensor at every wind station to measure wind speed and wind direction; their role is to record data containing the wind speed and wind direction for every hour.

Wind Pioneers measures wind speeds at various sensor heights, such as ws_120m and ws_100m. There are several observations per minute, so the number of observations grows rapidly every hour, making the data far too large to analyze with manual calculations. That is why we came up with a tool: the Long Term Tool.

I worked on this project with a team. The tool provides interactive software for performing all of the analysis, such as plots, correlation values, and scatter plots for finding the relationship between two variables. You can simply download the files you are working on, and it gives you everything in detail.

Here we take NASA data from the past 30 years, containing wind speed and wind direction, as the reference data, and use a linear regression model to predict the wind speeds of a particular location for the next 30 years.

We fit the linear model for various time periods: 1 hour, 6 hours, 1 day, 3 days, 7 days, 10 days, and 1 month. Sometimes the weather file and the climate file differ in their timing, so to compensate we apply a time shift to the reference file.
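A rough sketch of that idea, assuming the site measurements and the NASA reference are time-indexed CSV files (all file names, column names, the shift amount, and the resampling period are assumptions):

```python
# Sketch: align site and NASA reference wind speeds, resample to a chosen
# period, fit a linear regression, and extend it over the long-term record.
import pandas as pd
from sklearn.linear_model import LinearRegression

site = pd.read_csv("site_ws_120m.csv", parse_dates=["timestamp"],
                   index_col="timestamp")            # hypothetical site data
nasa = pd.read_csv("nasa_reference.csv", parse_dates=["timestamp"],
                   index_col="timestamp")            # ~30 years of reference data

# Optional time shift to compensate for misaligned weather/climate files.
nasa_shifted = nasa.shift(freq="1h")                 # assumed 1-hour shift

period = "1D"                                        # also try 1h, 6h, 3D, 7D, 10D, 1M
merged = pd.concat([site["ws_120m"].resample(period).mean(),
                    nasa_shifted["ws_ref"].resample(period).mean()],
                   axis=1, keys=["site", "ref"]).dropna()

reg = LinearRegression().fit(merged[["ref"]], merged["site"])
print("r^2 between site and reference:", reg.score(merged[["ref"]], merged["site"]))

# Predict site wind speed over the full reference period (long-term estimate).
ref_long = nasa_shifted[["ws_ref"]].resample(period).mean().dropna()
long_term = reg.predict(ref_long.rename(columns={"ws_ref": "ref"}))
```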



More Details: long term tool

Submitted By