Ed George, University of Texas
Empirical Bayes Methods For The Variable Selection Problem
For
the problem of variable selection for the normal linear model, fixed
penalty selection criteria such as AIC, Cp, BIC and RIC are shown to
select maximum posterior models under implicit hyperparameter choices
for a popular hierarchical Bayes formulation. Motivated by this
formulation, we propose two empirical Bayes selection criteria, MML
and CML, which use hyperparameter estimates instead of fixed choices.
Unlike the traditional criteria with fixed dimensionality penalties,
these criteria have penalties that depend on the data. Their
performance is seen to adaptively approximate that of the best fixed
penalty criterion across a variety of
setups. For the problem of data compression and denoising with
wavelets, these methods are seen to offer improved performance over
fixed hyperparameter Bayes and classical estimators. Extensions to
heavy-tailed error distributions are also described.
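
For concreteness, the fixed penalty criteria named above can all be written in a common penalized form; the following is a sketch under standard normal-linear-model notation (the symbols SS_gamma, q_gamma, n, and p are assumptions supplied here, not given in the abstract). Each criterion selects the subset gamma of predictors maximizing

\[
\frac{SS_\gamma}{\hat{\sigma}^2} \;-\; F \, q_\gamma ,
\]

where SS_gamma is the regression sum of squares of the model containing the q_gamma selected variables, n is the sample size, and p is the number of candidate predictors: F = 2 yields Cp (and approximately AIC), F = \log n yields BIC, and F = 2 \log p yields RIC. The empirical Bayes criteria MML and CML in effect replace the fixed F with a penalty estimated from the data.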