Linear Regression Part 3 – Evaluation of the model

Check out Part 1 and Part 2 of the series before going further

Linear Regression Part 1 – Assumption and Basics of LR
Linear Regression Part 2 – Code and implementation of model

If you want to learn more about Linear Regression and ARIMA forecasting in R with 100 questions, you can try our book on Amazon, ‘100 Linear Regression and ARIMA forecasting questions in R’.


Accuracy is not the only measure to evaluate a Linear Regression model. There are multiple ways in which you can evaluate an LR model; we will discuss four of these:
1. R-Squared
2. Adjusted R-Squared
3. F-Test
4. RMSE

1. R-Squared
SST, i.e. Sum of Squares Total – how far the data points are from their mean
SSE, i.e. Sum of Squares Error – how far the data points are from the model’s predicted values

R-Squared = (SST − SSE) / SST = 1 − SSE/SST
It indicates the goodness of fit of the model.
R-squared has the useful property that its scale is intuitive: it ranges from zero to one, with zero indicating that the proposed model does not improve prediction over the mean model, and one indicating perfect prediction. Improvement in the regression model results in proportional increases in R-squared.
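As a quick illustration, here is a minimal sketch in R that computes R-squared from SST and SSE and checks it against the value reported by lm(). The built-in mtcars dataset and the mpg ~ wt model are assumptions chosen purely for demonstration.

# Fit a simple linear model (illustrative example on the built-in mtcars data)
fit <- lm(mpg ~ wt, data = mtcars)

# SST: total variation of the dependent variable around its mean
sst <- sum((mtcars$mpg - mean(mtcars$mpg))^2)

# SSE: variation left unexplained by the model (sum of squared residuals)
sse <- sum(residuals(fit)^2)

# R-Squared = (SST - SSE) / SST
r_squared <- (sst - sse) / sst
r_squared                        # same value as summary(fit)$r.squared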

2. Adjusted R-Squared
“The more predictors, the better the R-squared.” Is this statement true? If yes, then how do we counter it?
This is true, and that is why we do not use R-squared as a success metric for models with a lot of predictor variables. The adjusted R-squared is a modified version of R-squared that has been adjusted for the number of predictors in the model. The adjusted R-squared increases only if a new term improves the model more than would be expected by chance; it decreases when a predictor improves the model by less than expected by chance. The adjusted R-squared can be negative, but it usually is not, and it is always lower than the R-squared.

Hence, if you are building a linear regression on multiple variables, it is always suggested that you use the adjusted R-squared to judge the goodness of fit of the model.

R-squared measures the proportion of the variation in your dependent variable (Y) explained by your independent variables (X) in a linear regression model. Adjusted R-squared adjusts this statistic based on the number of independent variables in the model.

Adjusted R-squared will decrease as predictors are added if the increase in model fit does not make up for the loss of degrees of freedom.
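A minimal sketch of the adjustment in R, again on the assumed mtcars example: with n observations and p predictors, Adjusted R-squared = 1 − (1 − R²)(n − 1)/(n − p − 1), which matches what lm() reports.

fit <- lm(mpg ~ wt + hp, data = mtcars)    # two predictors, p = 2
r2  <- summary(fit)$r.squared

n <- nrow(mtcars)                          # number of observations
p <- 2                                     # number of predictors

# Penalize R-squared for every extra predictor (loss of degrees of freedom)
adj_r2 <- 1 - (1 - r2) * (n - 1) / (n - p - 1)
adj_r2                                     # same value as summary(fit)$adj.r.squared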

3. F-Test
The F-test evaluates the null hypothesis that all regression coefficients are equal to zero versus the alternative that at least one is not. An equivalent null hypothesis is that R-squared equals zero.
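In R, the overall F-test is reported in the last line of summary() for any lm() fit; here is a minimal sketch on the same assumed mtcars example:

fit <- lm(mpg ~ wt + hp, data = mtcars)

# The F-statistic comes back as a named vector: value, numerator df, denominator df
f <- summary(fit)$fstatistic
f

# p-value for H0: all slope coefficients are zero (equivalently, R-squared = 0)
pf(f["value"], f["numdf"], f["dendf"], lower.tail = FALSE)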

4. RMSE

Root Mean Square Error takes the difference between the predicted and actual values, squares these differences, averages them over the total number of observations, and then takes the square root of that average: RMSE = √(Σ(predicted − actual)² / n).

As you can observe, the RMSE penalizes large prediction errors quite heavily because the differences are squared.
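A minimal sketch of the in-sample RMSE in R, once more on the assumed mtcars example:

fit <- lm(mpg ~ wt, data = mtcars)

# Squared differences between actual and predicted (fitted) values
sq_err <- (mtcars$mpg - fitted(fit))^2

# RMSE: square root of the average squared error
rmse <- sqrt(mean(sq_err))
rmse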

There are other methods as well to determine the performance of a linear model. These three articles will definitely help you to kick-start your “modeling career”.

Keep Learning 🙂

The Data Monk