Linear model selection and regularization

1 min read

Linear models have the advantage of interpretability

The most basic ones are fitted by least squares

How can they be improved?

Subset selection

Selecting a specific subset of predictors, also known as variable selection or feature selection
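A minimal sketch of best subset selection, assuming scikit-learn and made-up synthetic data: every subset of predictors is fitted by least squares and compared by cross-validated score (training R² alone would always favor the full model).

```python
# Illustrative best-subset search on synthetic data (not a production method:
# the 2^p loop is only feasible for small p).
from itertools import combinations

import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 4))
# The response depends only on columns 0 and 2.
y = 3 * X[:, 0] - 2 * X[:, 2] + rng.normal(scale=0.1, size=100)

best_score, best_subset = -np.inf, None
for k in range(1, X.shape[1] + 1):
    for cols in combinations(range(X.shape[1]), k):
        # Cross-validation penalizes subsets that add uninformative predictors.
        score = cross_val_score(LinearRegression(), X[:, cols], y, cv=5).mean()
        if score > best_score:
            best_score, best_subset = score, cols

print(best_subset)  # should include the informative columns 0 and 2
```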

Shrinkage or regularization

A model is fit using all p predictors, but the coefficients of some are shrunk toward 0

Ridge regression shrinks coefficients toward 0 but keeps them all nonzero

The lasso can shrink some coefficients exactly to 0, performing variable selection
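The contrast can be seen in a small sketch, assuming scikit-learn; the data and the penalty strengths (`alpha`) are arbitrary choices made only to make the difference visible.

```python
# Ridge vs. lasso on the same synthetic data: ridge keeps every coefficient
# nonzero but small, while lasso zeros out the irrelevant ones.
import numpy as np
from sklearn.linear_model import Lasso, Ridge

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
# Only the first two predictors matter.
y = 4 * X[:, 0] + 2 * X[:, 1] + rng.normal(scale=0.5, size=200)

ridge = Ridge(alpha=10.0).fit(X, y)
lasso = Lasso(alpha=0.5).fit(X, y)

print(np.sum(ridge.coef_ != 0))  # all 10 coefficients remain nonzero
print(np.sum(lasso.coef_ != 0))  # only the informative ones survive
```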

Dimension reduction

Projecting the predictors into a lower-dimensional space

PCA (principal components analysis)

Find the directions of the data along which the observations vary the most

Variables should be centered to have mean 0 and scaled to have standard deviation 1

The principal component vectors are unique (up to a sign flip), so different software packages will produce the same results
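A minimal PCA sketch, assuming scikit-learn and made-up data with two correlated variables, showing the centering/scaling step and the first direction capturing most of the variance:

```python
# PCA on standardized data: most variance lies along the x1 = x2 direction.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
base = rng.normal(size=300)
# Two strongly correlated variables built from a common signal.
X = np.column_stack([base + rng.normal(scale=0.1, size=300),
                     base + rng.normal(scale=0.1, size=300)])

X_std = StandardScaler().fit_transform(X)  # mean 0, standard deviation 1
pca = PCA(n_components=2).fit(X_std)

# The first component should explain nearly all of the variance.
print(pca.explained_variance_ratio_)
```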

PCR (principal components regression): use the principal components as the regression variables

Unsupervised, because there is no guarantee that the directions that best explain the predictors will also be the best directions for predicting the response
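A PCR sketch, assuming scikit-learn: scale, reduce with PCA, then fit least squares on the component scores. The data are synthetic (a single latent factor drives all predictors), and the choice of one component is an assumption that would normally be made by cross-validation.

```python
# Principal components regression as a pipeline: scale -> PCA -> least squares.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
factor = rng.normal(size=200)                     # latent factor
X = factor[:, None] + rng.normal(scale=0.3, size=(200, 6))
y = 2 * factor + rng.normal(scale=0.2, size=200)  # response driven by the factor

pcr = make_pipeline(StandardScaler(), PCA(n_components=1), LinearRegression())
pcr.fit(X, y)
print(round(pcr.score(X, y), 2))  # high R^2: PC1 recovers the latent factor
```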

Partial least squares

A supervised alternative to PCR: the directions are chosen using the response as well as the predictors

January 16, 2020