Expectation propagation is a general approach to deterministic approximate Bayesian inference for graphical models, although its literature is confined mostly to machine learning applications. We investigate the utility of expectation propagation in generalised, linear, and mixed model settings. We show that, even though the algebra and computations are complicated, the notion of message passing on factor graphs affords streamlining of the required calculations, and we list the algorithmic steps explicitly. Numerical studies indicate that expectation propagation is marginally more accurate than a competing method for the models considered, but at the expense of greater algebraic and computational overhead.
The machine learning technique known as expectation propagation is studied within the statistical context of generalised, linear and mixed models.
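To illustrate the flavour of the message-passing calculations referred to in the abstract, the following is a minimal sketch of an expectation propagation sweep for Bayesian probit regression with a Gaussian prior, a standard special case of the generalised models considered. The simulated data, prior variance, damping-free updates and convergence tolerance are illustrative assumptions and not the paper's algorithm; each likelihood factor is approximated by a Gaussian site, with sites refined by cavity-and-moment-matching updates.

```python
# Sketch of expectation propagation (EP) for Bayesian probit regression.
# Assumptions for illustration: simulated data, prior w ~ N(0, 10 I),
# full recomputation of the global Gaussian approximation after each site update.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# Simulated probit data: y_i in {-1, +1}, latent f_i = x_i^T w.
n, p = 200, 3
X = rng.normal(size=(n, p))
w_true = np.array([1.0, -0.5, 0.25])
y = np.where(rng.uniform(size=n) < norm.cdf(X @ w_true), 1.0, -1.0)

sigma2_prior = 10.0           # prior variance for each coefficient
tau_site = np.zeros(n)        # site natural parameters: precisions
nu_site = np.zeros(n)         # site natural parameters: precision * mean

def global_approx(tau_site, nu_site):
    """Combine the Gaussian prior and all Gaussian sites into the posterior for w."""
    Q = np.eye(p) / sigma2_prior + X.T @ (tau_site[:, None] * X)
    Sigma = np.linalg.inv(Q)
    m = Sigma @ (X.T @ nu_site)
    return m, Sigma

m_w, Sigma_w = global_approx(tau_site, nu_site)

for sweep in range(50):
    max_change = 0.0
    for i in range(n):
        x_i = X[i]
        # Marginal of f_i under the current global approximation.
        mu_i = x_i @ m_w
        var_i = x_i @ Sigma_w @ x_i
        # Cavity distribution: remove site i from the marginal.
        tau_cav = 1.0 / var_i - tau_site[i]
        nu_cav = mu_i / var_i - nu_site[i]
        if tau_cav <= 0.0:    # skip a numerically unstable update
            continue
        m_cav, v_cav = nu_cav / tau_cav, 1.0 / tau_cav
        # Moments of the tilted distribution: probit likelihood times cavity.
        z = y[i] * m_cav / np.sqrt(1.0 + v_cav)
        ratio = norm.pdf(z) / norm.cdf(z)
        mu_hat = m_cav + y[i] * v_cav * ratio / np.sqrt(1.0 + v_cav)
        var_hat = v_cav - v_cav**2 * ratio * (z + ratio) / (1.0 + v_cav)
        # Moment matching gives the refreshed site parameters.
        tau_new = 1.0 / var_hat - tau_cav
        nu_new = mu_hat / var_hat - nu_cav
        max_change = max(max_change, abs(tau_new - tau_site[i]))
        tau_site[i], nu_site[i] = tau_new, nu_new
        m_w, Sigma_w = global_approx(tau_site, nu_site)
    if max_change < 1e-6:
        break

print("EP posterior mean of w:", np.round(m_w, 3))
print("true w:                ", w_true)
```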