Expectation propagation
Expectation propagation (EP) is a technique in Bayesian machine learning.[1]
EP finds approximations to a probability distribution.[1] It uses an iterative approach that exploits the factorization structure of the target distribution.[1] It differs from other Bayesian approximation approaches such as variational Bayesian methods.[1]
More specifically, suppose we wish to approximate an intractable probability distribution $p(\mathbf{x})$ with a tractable distribution $q(\mathbf{x})$. Expectation propagation achieves this approximation by minimizing the Kullback–Leibler divergence $\mathrm{KL}(p\|q)$.[1] Variational Bayesian methods minimize $\mathrm{KL}(q\|p)$ instead.[1]
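For reference, the objective minimized by EP (not written out in the article, but standard) is
$$\mathrm{KL}(p\,\|\,q) = \int p(\mathbf{x}) \log \frac{p(\mathbf{x})}{q(\mathbf{x})}\, d\mathbf{x},$$
which is asymmetric in its arguments: minimizing $\mathrm{KL}(p\|q)$ pushes $q$ to cover all regions where $p$ has mass, whereas minimizing $\mathrm{KL}(q\|p)$, as variational Bayes does, tends to concentrate $q$ on a single mode of $p$.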
If $q(\mathbf{x})$ is a Gaussian $\mathcal{N}(\mathbf{x};\mu,\Sigma)$, then $\mathrm{KL}(p\|q)$ is minimized by setting $\mu$ and $\Sigma$ equal to the mean of $p(\mathbf{x})$ and the covariance of $p(\mathbf{x})$, respectively; this is called moment matching.[1]
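As a concrete illustration (a minimal sketch, not taken from the article), the snippet below moment-matches a single Gaussian $q$ to a two-component Gaussian mixture $p$ in one dimension; the mixture parameters are made up for the example.

```python
import numpy as np

# Moment matching in 1D: approximate a two-component Gaussian mixture p(x)
# with a single Gaussian q(x) = N(x; mu, var). Minimizing KL(p || q) over
# (mu, var) reduces to matching the mean and variance of p.

# Illustrative mixture parameters (assumed for this example)
weights = np.array([0.3, 0.7])
means = np.array([-2.0, 1.5])
variances = np.array([0.5, 1.0])

# Mean of p: weighted average of the component means
mu = np.sum(weights * means)

# Variance of p via E[x^2] - (E[x])^2, where each component
# contributes E[x^2] = variance + mean^2
second_moment = np.sum(weights * (variances + means**2))
var = second_moment - mu**2

print(f"moment-matched q: mean = {mu:.4f}, variance = {var:.4f}")
```

Even though $p$ here is bimodal, the $\mathrm{KL}(p\|q)$-optimal Gaussian spreads itself across both modes, in line with the mass-covering behaviour noted above.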
Applications
Expectation propagation via moment matching plays a vital role in approximating the indicator-function factors that appear when deriving the message-passing equations for TrueSkill.
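To make this concrete, the sketch below computes the Gaussian that moment-matches a truncated Gaussian $p(x) \propto \mathcal{N}(x;\mu,\sigma^2)\,\mathbb{I}(x>0)$, the kind of indicator factor that arises in TrueSkill when conditioning a performance-difference belief on the observed winner. The function name is made up for this example, but the truncated-normal moment formulas are standard.

```python
import numpy as np
from scipy.stats import norm

def truncated_gaussian_moments(mu, sigma):
    """Moment-match p(x) ~ N(x; mu, sigma^2) * I(x > 0) with a Gaussian.

    Returns the mean and standard deviation of the truncated density,
    i.e. the parameters of the Gaussian q minimizing KL(p || q).
    """
    t = mu / sigma
    v = norm.pdf(t) / norm.cdf(t)   # additive correction to the mean
    w = v * (v + t)                 # fractional reduction in variance
    new_mu = mu + sigma * v
    new_sigma = sigma * np.sqrt(1.0 - w)
    return new_mu, new_sigma

# Example: belief over a skill difference, conditioned on "x > 0" (a win)
print(truncated_gaussian_moments(0.5, 1.0))  # -> roughly (1.009, 0.697)
```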
References
- Thomas Minka (August 2–5, 2001). "Expectation Propagation for Approximate Bayesian Inference". In Jack S. Breese, Daphne Koller (eds.), UAI '01: Proceedings of the 17th Conference on Uncertainty in Artificial Intelligence. University of Washington, Seattle, Washington, USA. pp. 362–369. http://research.microsoft.com/en-us/um/people/minka/papers/ep/minka-ep-uai.pdf.