When we have data, we can use information criteria to compare different regression models fit to that data, each using some or all of the available variables. Below are the quantities we will use:
n — the number of observations in the sample.
p — the number of coefficients in the model we are considering. This includes the intercept, so it is usually equal to the number of predictor variables plus one.
SSE_p — the SSE for the model we are considering.
SSE_T — the SSE for the model with all of the variables.
MSE_p — the MSE for the model we are considering.
MSE_T — the MSE for the model with all of the variables.
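As a concrete illustration, here is a minimal sketch of computing SSE_p and MSE_p for a model fit by ordinary least squares. The helper name `sse_mse` and the toy data are illustrative assumptions, not part of the original text.

```python
import numpy as np

def sse_mse(X, y):
    """Return (SSE, MSE) for an OLS fit of y on X (X includes the intercept column)."""
    n, p = X.shape                          # n observations, p coefficients
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    sse = float(resid @ resid)              # sum of squared errors
    mse = sse / (n - p)                     # mean squared error, with n - p degrees of freedom
    return sse, mse

# made-up toy data: y depends on one variable plus noise
rng = np.random.default_rng(0)
x = rng.normal(size=30)
y = 2.0 + 1.5 * x + rng.normal(scale=0.5, size=30)
X = np.column_stack([np.ones(30), x])       # intercept + one predictor, so p = 2
sse, mse = sse_mse(X, y)
```

Fitting the same response with different subsets of columns in `X` yields the SSE_p and MSE_p values that the criteria below compare.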
And here are the different criteria that are often used:
It is worth noting that the R-squared value is itself a useful consideration when comparing regression models.
The advantage of the adjusted R-squared is that it adds a penalty as we add more predictors to the model. Nonetheless, the same principle holds: the larger the adjusted R-squared, the stronger the model. It is defined as

R²_{a,p} = 1 − ((n − 1) / (n − p)) · (SSE_p / SSTO),

where SSTO is the total sum of squares. Alternatively, we can calculate it as

R²_{a,p} = 1 − MSE_p / (SSTO / (n − 1)).
We want a large adjusted R-squared.
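The two forms of the adjusted R-squared can be checked to agree with a small sketch; the helper names and numeric inputs below are made up for illustration.

```python
def adjusted_r2_via_sse(sse_p, ssto, n, p):
    # R^2_adj = 1 - (n - 1)/(n - p) * SSE_p / SSTO
    return 1.0 - (n - 1) / (n - p) * sse_p / ssto

def adjusted_r2_via_mse(mse_p, ssto, n):
    # equivalent form: 1 - MSE_p / (SSTO / (n - 1))
    return 1.0 - mse_p / (ssto / (n - 1))

# made-up example values: SSE_p = 40, SSTO = 100, n = 25, p = 3
sse_p, ssto, n, p = 40.0, 100.0, 25, 3
mse_p = sse_p / (n - p)                  # MSE_p = SSE_p / (n - p)
a = adjusted_r2_via_sse(sse_p, ssto, n, p)
b = adjusted_r2_via_mse(mse_p, ssto, n)
```

Both forms give the same number, since MSE_p = SSE_p / (n − p); the adjusted value always sits below the plain R-squared (here 0.6) whenever p > 1.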
Akaike’s Information Criterion (AIC)

AIC_p = n · ln(SSE_p) − n · ln(n) + 2p.

We want a small AIC.
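A minimal sketch of the SSE-based AIC formula, with made-up inputs; it shows that adding a coefficient without improving the fit raises the AIC by exactly 2.

```python
import math

def aic(sse_p, n, p):
    # AIC_p = n ln(SSE_p) - n ln(n) + 2p
    return n * math.log(sse_p) - n * math.log(n) + 2 * p

# made-up comparison: same fit quality (SSE = 40), one extra coefficient
base = aic(40.0, 25, 3)
bigger = aic(40.0, 25, 4)
```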
Bayesian Information Criterion (BIC)
Also called Schwarz’s Bayesian Criterion (SBC).

BIC_p = n · ln(SSE_p) − n · ln(n) + p · ln(n).

We want a small BIC.
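A sketch of the BIC with made-up inputs; the only difference from the AIC is that each extra coefficient costs ln(n) rather than 2, so for larger samples the BIC penalizes added predictors more heavily.

```python
import math

def bic(sse_p, n, p):
    # BIC_p = n ln(SSE_p) - n ln(n) + p ln(n)
    return n * math.log(sse_p) - n * math.log(n) + p * math.log(n)

# made-up comparison: same fit quality (SSE = 40), one extra coefficient
b3 = bic(40.0, 25, 3)
b4 = bic(40.0, 25, 4)
```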
Amemiya’s Prediction Criterion (APC)

APC_p = ((n + p) / (n · (n − p))) · SSE_p.

We want a small APC.
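A sketch of the APC formula with made-up inputs; like the AIC and BIC, it rewards a small SSE_p while the (n + p)/(n − p) factor penalizes extra coefficients.

```python
def apc(sse_p, n, p):
    # Amemiya's Prediction Criterion: ((n + p) / (n * (n - p))) * SSE_p
    return (n + p) / (n * (n - p)) * sse_p

# made-up inputs: SSE_p = 40, n = 25, p = 3
value = apc(40.0, 25, 3)
```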