Bayes Factors via Savage-Dickey Supermodels [IMA]
How could I possibly resist reblogging an arXiver post about “Savage-Dickey Supermodels”?
http://arxiv.org/abs/1609.02186
We outline a new method to compute the Bayes Factor for model selection which bypasses the Bayesian Evidence. Our method combines multiple models into a single, nested, Supermodel using one or more hyperparameters. Since the models are now nested, the Bayes Factors between the models can be efficiently computed using the Savage-Dickey Density Ratio (SDDR). In this way model selection becomes a problem of parameter estimation. We consider two ways of constructing the supermodel in detail: one based on combined models, and a second based on combined likelihoods. We report on these two approaches for a Gaussian linear model, for which the Bayesian evidence can be calculated analytically, and for a toy nonlinear problem. Unlike the combined-model approach, where a standard Markov chain Monte Carlo (MCMC) sampler struggles, the combined-likelihood approach fares much better in providing a reliable estimate of the log-Bayes Factor. This scheme potentially opens the way to…
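For readers unfamiliar with the Savage-Dickey device the abstract relies on: for a model M0 nested in M1 at θ = θ0, the Bayes factor reduces to the ratio of the posterior to the prior density of θ at θ0 under M1, so no evidence integral is needed. A minimal sketch of this identity, using a hypothetical conjugate Gaussian example (not the paper's own test case: data N(θ, σ²), prior θ ~ N(0, τ²), M0 fixing θ = 0), where both sides can be computed in closed form:

```python
import math

# Hypothetical toy setup (not the paper's example): y_i ~ N(theta, sigma2),
# prior theta ~ N(0, tau2); nested model M0 fixes theta = 0.
y = [0.3, -0.1, 0.4, 0.2, 0.0]
sigma2, tau2 = 1.0, 4.0
n, s, ss = len(y), sum(y), sum(v * v for v in y)

def log_norm_pdf(x, mean, var):
    """Log density of N(mean, var) at x."""
    return -0.5 * (math.log(2 * math.pi * var) + (x - mean) ** 2 / var)

# Conjugate posterior for theta under M1.
post_var = 1.0 / (n / sigma2 + 1.0 / tau2)
post_mean = post_var * s / sigma2

# Savage-Dickey density ratio: B01 = p(theta=0 | D, M1) / p(theta=0 | M1).
log_bf_sddr = log_norm_pdf(0.0, post_mean, post_var) - log_norm_pdf(0.0, 0.0, tau2)

# Direct route via evidences: p(D|M0) is the likelihood at theta = 0;
# p(D|M1) is Gaussian with covariance sigma2*I + tau2*J (J the all-ones matrix),
# whose determinant and inverse follow from the matrix determinant lemma
# and the Sherman-Morrison formula.
log_ev_m0 = sum(log_norm_pdf(v, 0.0, sigma2) for v in y)
log_det = (n - 1) * math.log(sigma2) + math.log(sigma2 + n * tau2)
quad = (ss - tau2 * s * s / (sigma2 + n * tau2)) / sigma2
log_ev_m1 = -0.5 * (n * math.log(2 * math.pi) + log_det + quad)
log_bf_evidence = log_ev_m0 - log_ev_m1

# The two routes agree to numerical precision.
print(log_bf_sddr, log_bf_evidence)
```

In an MCMC setting the posterior density at θ0 would be estimated from the chain (e.g. by a kernel density estimate at θ = 0) rather than computed analytically, which is where the paper's combined-model versus combined-likelihood distinction matters.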
September 12, 2016 at 5:31 pm
Ah, I have already commented on this paper, but the post will only appear tomorrow!
September 12, 2016 at 5:42 pm
I look forward to reading it!
September 12, 2016 at 5:43 pm
Expect the usual Gallic pessimistic take on everything…! Especially Savage-Dickey.
September 12, 2016 at 5:50 pm
Wouldn’t you rather have a Cox-Zucker Machine?