
# Information criteria for evaluating regression models

When we have data, we use information criteria to compare different regression models fitted to that data, each using some or all of the available variables. Below are the quantities we will use.

| Symbol | Description |
| --- | --- |
| $n$ | The number of observations in the sample. |
| $p$ | The number of coefficients for the model we are considering. This includes the intercept, so it is usually equal to the number of variables plus one. |
| $SSE_p$ | The SSE (error sum of squares) for the model we are considering. |
| $SSE_{all}$ | The SSE for the model with all of the variables. |
| $MSE_p$ | The MSE (mean squared error) for the model we are considering. |
| $MSE_{all}$ | The MSE for the model with all of the variables. |
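As a concrete illustration, here is a minimal sketch of how these quantities can be computed for ordinary least-squares fits. It uses numpy with made-up toy data; the function name `sse_mse` and the data are hypothetical, not part of any particular library:

```python
import numpy as np

def sse_mse(X, y):
    """Fit least squares with an intercept and return (p, SSE, MSE)."""
    Xi = np.column_stack([np.ones(len(y)), X])   # prepend the intercept column
    beta, *_ = np.linalg.lstsq(Xi, y, rcond=None)
    resid = y - Xi @ beta
    n, p = Xi.shape                  # p counts the intercept
    sse = float(resid @ resid)
    return p, sse, sse / (n - p)     # MSE = SSE / (n - p)

# toy data: y depends on x1; x2 is pure noise
rng = np.random.default_rng(0)
x1, x2 = rng.normal(size=(2, 40))
y = 2.0 + 3.0 * x1 + rng.normal(scale=0.5, size=40)

p_all, sse_all, mse_all = sse_mse(np.column_stack([x1, x2]), y)  # full model
p_sub, sse_p, mse_p = sse_mse(x1[:, None], y)                    # subset model
```

Because the subset model is nested inside the full model, its SSE can never be smaller, which is exactly why raw SSE alone cannot be used to compare models of different sizes.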

And here are the different criteria that are often used, each with its formula and a short description.

**R-squared.** $R^2 = 1 - \dfrac{SSE_p}{SSTO}$, where $SSTO = \sum_i (y_i - \bar{y})^2$ is the total sum of squares. It is worth noting that the R-squared value is in itself a useful consideration when comparing regression models: the larger the R-squared, the more of the variation in the response the model explains.

**Adjusted R-squared.** $R_a^2 = 1 - \dfrac{n-1}{n-p}\left(1 - R^2\right)$. The advantage of the adjusted R-squared is that it adds a penalty when we add more predictors to the model. Nonetheless, the same principle holds: the larger the (adjusted) R-squared, the stronger the model. Alternatively, we can calculate it as $R_a^2 = 1 - \dfrac{n-1}{n-p}\cdot\dfrac{SSE_p}{SSTO}$.
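To make the two definitions concrete, here is a small sketch with numpy and toy data (the helper `r2_adj` and the data are purely illustrative):

```python
import numpy as np

def r2_adj(X, y):
    """Return (R^2, adjusted R^2) for a least-squares fit with an intercept."""
    Xi = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(Xi, y, rcond=None)
    n, p = Xi.shape
    sse = float(((y - Xi @ beta) ** 2).sum())   # SSE_p
    ssto = float(((y - y.mean()) ** 2).sum())   # total sum of squares
    r2 = 1.0 - sse / ssto
    return r2, 1.0 - (n - 1) / (n - p) * (1.0 - r2)

# toy data: y depends on x1 only; x2 is pure noise
rng = np.random.default_rng(1)
x1, x2 = rng.normal(size=(2, 30))
y = 1.0 + 2.0 * x1 + rng.normal(scale=0.3, size=30)

r2_sub, adj_sub = r2_adj(x1[:, None], y)                 # x1 only
r2_all, adj_all = r2_adj(np.column_stack([x1, x2]), y)   # x1 and the noise x2
```

Adding the pure-noise predictor `x2` can only raise $R^2$, while the adjusted version may go down; that difference is the penalty at work.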

**Mallow's $C_p$.** $C_p = \dfrac{SSE_p}{MSE_{all}} - (n - 2p)$. We want a $C_p$ that is less than (and not equal to) $p$, conveying that the model is unbiased. In other words, we just compare $C_p$ to the number of variables in the model plus one (for the intercept), which is exactly $p$.
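A sketch of the $C_p$ computation, again with numpy and hypothetical toy data:

```python
import numpy as np

def fit(X, y):
    """Least squares with an intercept; returns (p, SSE)."""
    Xi = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(Xi, y, rcond=None)
    return Xi.shape[1], float(((y - Xi @ beta) ** 2).sum())

# toy data: y depends on x1 only; x2 is pure noise
rng = np.random.default_rng(2)
x1, x2 = rng.normal(size=(2, 50))
y = 0.5 + 1.5 * x1 + rng.normal(scale=0.4, size=50)
n = len(y)

p_all, sse_all = fit(np.column_stack([x1, x2]), y)
mse_all = sse_all / (n - p_all)                  # MSE of the full model

p, sse_p = fit(x1[:, None], y)                   # candidate subset model
cp = sse_p / mse_all - (n - 2 * p)               # Mallow's C_p for the subset
cp_full = sse_all / mse_all - (n - 2 * p_all)    # equals p_all by construction
```

By construction the full model always has $C_p = p$, so `cp_full` equals `p_all` exactly; the interesting comparison is `cp` for the subset model against its own $p$.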

**Akaike’s Information Criterion (AIC).** $AIC_p = n\ln(SSE_p) - n\ln(n) + 2p$. The smaller the AIC, the better the model.

**Bayesian Information Criterion (BIC).** $BIC_p = n\ln(SSE_p) - n\ln(n) + p\ln(n)$. Also called Schwarz’s Bayesian Criterion (SBC). As with the AIC, smaller is better; since $\ln(n) > 2$ once $n \geq 8$, the BIC penalizes additional coefficients more heavily than the AIC.

**Amemiya’s Prediction Criterion (APC).** $APC_p = \dfrac{n+p}{n(n-p)}\,SSE_p$. Again, the smaller the value, the better the model.
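Finally, a sketch computing AIC, BIC, and APC for a candidate model from the formulas above (numpy; the helper names and toy data are illustrative):

```python
import numpy as np

def sse_of(X, y):
    """SSE of a least-squares fit with an intercept column; returns (p, SSE)."""
    Xi = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(Xi, y, rcond=None)
    return Xi.shape[1], float(((y - Xi @ beta) ** 2).sum())

def criteria(n, p, sse):
    """AIC, BIC and APC from the formulas above; smaller is better for all three."""
    aic = n * np.log(sse) - n * np.log(n) + 2 * p
    bic = n * np.log(sse) - n * np.log(n) + p * np.log(n)
    apc = (n + p) / (n * (n - p)) * sse
    return aic, bic, apc

# toy data: y depends on x1 only; x2 is pure noise
rng = np.random.default_rng(3)
x1, x2 = rng.normal(size=(2, 60))
y = 1.0 + 2.5 * x1 + rng.normal(scale=0.5, size=60)
n = len(y)

p, sse = sse_of(x1[:, None], y)
aic, bic, apc = criteria(n, p, sse)
```

Note that the AIC and BIC differ only in the coefficient penalty ($2p$ versus $p\ln(n)$), so with $n = 60$ here the BIC value comes out larger.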