Statistics - Akaike information criterion (AIC)


1 - About

AIC stands for Akaike Information Criterion.

Akaike is the name of the statistician who proposed it.

AIC is a quantity that can be calculated for many different model types, not just linear models, but also classification models such as logistic regression.


3 - Definition

The AIC criterion is defined for a large class of models fit by maximum likelihood:

<MATH> AIC = -2 \log L + 2 d </MATH>

where:

  • L is the maximized value of the likelihood function for the estimated model.
  • d is the total number of parameters used in the model (regression coefficients + intercept)
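
As a quick illustration of the formula, here is a minimal sketch in Python that computes AIC from a maximized log-likelihood and a parameter count. The data, the Gaussian likelihood, and the choice of d = 2 are assumptions made only for this example.

```python
import numpy as np

def aic(log_likelihood, n_params):
    """AIC = -2 log L + 2 d; smaller values indicate a better fit/complexity trade-off."""
    return -2.0 * log_likelihood + 2.0 * n_params

# Hypothetical example: residuals of some fitted model, assumed Gaussian
residuals = np.array([0.5, -1.2, 0.3, 0.8, -0.4])
n = residuals.size
sigma2 = residuals.var()                              # ML estimate of the error variance
log_l = -0.5 * n * (np.log(2 * np.pi * sigma2) + 1)   # maximized Gaussian log-likelihood

print(aic(log_l, n_params=2))                         # d = 2: mean + variance in this toy model
```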

4 - Linear model

It turns out that, in the case of a linear model with Gaussian errors, <math>-2 \log L</math> is just equal to the RSS divided by <math>\hat{\sigma}^2</math>:

<MATH> -2 \log L = \frac{RSS}{\hat{\sigma}^2} </MATH>

where:

  • RSS is the residual sum of squares of the fitted model.
  • <math>\hat{\sigma}^2</math> is an estimate of the variance of the error term.

Plugging this into the formula above, we can see that AIC and Mallows's Cp are proportional to each other: for linear models they are essentially the same criterion.
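
To make the connection concrete, here is a minimal sketch (assuming a small simulated dataset and a hypothetical helper rss_of) that fits a few nested least-squares models and prints AIC next to Mallows's Cp. The last column shows that AIC, rescaled by <math>\hat{\sigma}^2 / n</math>, reproduces Cp exactly.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: a response driven by the first two of four candidate predictors
n = 200
X = rng.normal(size=(n, 4))
y = 3.0 + 1.5 * X[:, 0] - 2.0 * X[:, 1] + rng.normal(size=n)

def rss_of(features):
    """Least-squares fit (with intercept) on the chosen columns; return the RSS."""
    A = np.column_stack([np.ones(n), X[:, features]])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    r = y - A @ beta
    return r @ r

# Error variance estimated from the full model, as is usual for Cp
full_rss = rss_of([0, 1, 2, 3])
sigma2 = full_rss / (n - 5)

for features in ([0], [0, 1], [0, 1, 2], [0, 1, 2, 3]):
    d = len(features) + 1                      # coefficients + intercept
    rss = rss_of(features)
    aic = rss / sigma2 + 2 * d                 # -2 log L = RSS / sigma^2, plus the 2d penalty
    cp = (rss + 2 * d * sigma2) / n            # Mallows's Cp
    print(features, round(aic, 2), round(cp, 3), round(aic * sigma2 / n, 3))
```

Both criteria rank the candidate models identically, which is the sense in which they are "the same thing" for linear models.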