Missing value treatment is, without doubt, one of the most important parts of the whole process of building a model. Why? Because we can't afford to eliminate rows wherever there is a missing value in any of the columns. ...
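As a rough illustration of that point (not from the original post), here is a minimal R sketch on a made-up data frame, contrasting row deletion with simple median imputation:

# Toy data frame (hypothetical values) with one missing Age
df <- data.frame(Age = c(34, NA, 28, 45), Income = c(52, 61, 47, 58))

# Dropping the row also throws away a perfectly good Income value
na.omit(df)

# Median imputation keeps the row instead
df$Age[is.na(df$Age)] <- median(df$Age, na.rm = TRUE)
df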
Story of Bias, Variance, Bias-Variance Trade-Off
Why do we predict? We predict in order to identify future trends using our sample data set. Whenever we create a model, we try to create a formula out of our sample data set. And ...
Multicollinearity in Simple Terms
We all know the definition of multicollinearity, i.e. when two or more explanatory variables in a multiple regression model are highly linearly related, it's called multicollinearity. Examples - age and selling price of a car; education and annual income; height and ...
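For a quick, hedged illustration (simulated data, not taken from the post): a high pairwise correlation, or a large Variance Inflation Factor from the car package's vif(), is a common way to flag the problem.

set.seed(1)
age     <- runif(100, 1, 10)                    # car age in years (simulated)
mileage <- 12000 * age + rnorm(100, sd = 5000)  # mileage tracks age closely
price   <- 20 - 1.8 * age + rnorm(100)          # selling price falls with age

cor(age, mileage)               # close to 1: the two predictors are collinear

fit <- lm(price ~ age + mileage)
car::vif(fit)                   # VIF well above ~5-10 signals multicollinearity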
GRE Verbal | Barron’s 800 Destroyed | Day 10
We are already good with 220 words, let's pass that 250 mark. All the words given below are directly from Barron's 800 most frequent words. 251. abeyance - temporary suspension. If you have ever created an ...
5 Most Important SQL questions for your Data Science Interview
SQL is the bread and butter of an analyst. You can't survive in the Data Science industry without a grip on this 'easy-looking' query language. I have interviewed with more than 30 companies in the past 3-4 ...
Cross Validation and varImp in R
I was on to our next book - Linear, Ridge, LASSO, and Elastic Net Algorithm explained in layman terms with code in R - when we thought of covering the simple concepts which are quite helpful while creating models. Cross Validation ...
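As a sketch of the two ideas (assuming the caret package and the built-in mtcars data, not the post's own example):

library(caret)

# 5-fold cross-validation for a plain linear model
ctrl <- trainControl(method = "cv", number = 5)
fit  <- train(mpg ~ ., data = mtcars, method = "lm", trControl = ctrl)

fit$results   # cross-validated RMSE and R-squared
varImp(fit)   # relative importance of each predictor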
Ridge vs LASSO vs Elastic Net Regression
Ridge and LASSO are two important regression models which come in handy when Linear Regression fails to work. This topic needed a separate mention; without doubt, it's important to understand the cost function and the way it's calculated for Ridge, LASSO, and any ...
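For reference (standard textbook form, not quoted from the post), the two cost functions differ only in the penalty added to the residual sum of squares:

\text{Ridge:}\quad J(\beta) = \sum_{i=1}^{n}\big(y_i - \hat{y}_i\big)^2 + \lambda \sum_{j=1}^{p} \beta_j^2
\qquad
\text{LASSO:}\quad J(\beta) = \sum_{i=1}^{n}\big(y_i - \hat{y}_i\big)^2 + \lambda \sum_{j=1}^{p} \lvert \beta_j \rvert

Elastic Net mixes the two penalties, and setting lambda to zero recovers ordinary linear regression in every case.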
Linear, LASSO, Elastic Net, and Ridge Regression in Layman terms (R) with complete code – Part 1
Linear, LASSO, Elastic Net, and Ridge Regression are the four regression techniques which are helpful for predicting or extrapolating predictions from historic data. Linear regression doesn't have any inclination towards the value of lambda. LASSO ...
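A minimal sketch of how the four variants map onto code, assuming the glmnet package and the built-in mtcars data (not the series' own code):

library(glmnet)

x <- as.matrix(mtcars[, -1])   # predictors
y <- mtcars$mpg                # response

ridge <- glmnet(x, y, alpha = 0)      # alpha = 0: Ridge (L2 penalty)
lasso <- glmnet(x, y, alpha = 1)      # alpha = 1: LASSO (L1 penalty)
enet  <- glmnet(x, y, alpha = 0.5)    # 0 < alpha < 1: Elastic Net
ols   <- lm(mpg ~ ., data = mtcars)   # plain linear regression: no lambda at all

cv <- cv.glmnet(x, y, alpha = 1)      # cross-validation picks lambda
cv$lambda.min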
One Hot Encoding – Feature Engineering
So, I just started solving the latest Hackathon on Analytics Vidhya, Women in the loop. Be it a real-life Data Science problem or a Hackathon, one-hot encoding is one of the most important parts of ...
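As a quick, hedged sketch (toy data, not the Hackathon's): base R's model.matrix, or caret's dummyVars, expands a categorical column into 0/1 indicator columns.

df <- data.frame(city  = factor(c("Delhi", "Mumbai", "Pune", "Delhi")),
                 sales = c(10, 15, 12, 11))

model.matrix(~ city - 1, data = df)   # one 0/1 column per city

library(caret)
dv <- dummyVars(~ city, data = df)
predict(dv, newdata = df)             # same encoding via caret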
Feature Engineering in Data Science
Have you ever wondered why two different people get different accuracy while using the same algorithm? We all know that XGBoost can help us get a very good result in our Hackathons, but even then only a few people achieve ...