Variational message passing

Because every child must be conjugate to its parent, this limits the types of distributions that can be used in the model. For example, the parents of a [[Gaussian distribution]] must be a [[Gaussian distribution]] (corresponding to the [[mean]]) and a [[gamma distribution]] (corresponding to the precision, i.e. the reciprocal of the variance, <math>1/\sigma^2</math>, in more common parameterizations). Discrete variables can have [[Dirichlet distribution|Dirichlet]] parents, and [[Poisson distribution|Poisson]] and [[Exponential distribution|exponential]] nodes must have [[gamma distribution|gamma]] parents. More recently, VMP has been extended to handle models that violate this conditional conjugacy constraint.{{cite journal |last1=Knowles |first1=David A. |last2=Minka |first2=Thomas P. |date=2011 |title=Non-conjugate Variational Message Passing for Multinomial and Binary Regression |journal=NeurIPS |url=https://proceedings.neurips.cc/paper/2011/file/5c936263f3428a40227908d5a3847c0b-Paper.pdf}}
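The conjugate structure above can be made concrete for the simplest such model: observed data with a Gaussian likelihood, a Gaussian parent for the mean, and a gamma parent for the precision. The following sketch (all hyperparameter values and variable names are illustrative, not from any particular implementation) iterates the two conjugate updates, which is exactly the mean-field coordinate ascent that VMP organizes as message passing:

```python
import numpy as np

# Synthetic data: true mean 2.0, true precision 1/0.5**2 = 4
rng = np.random.default_rng(0)
x = rng.normal(loc=2.0, scale=0.5, size=200)
n = len(x)

# Hypothetical broad priors: mu ~ N(mu0, 1/lam0), tau ~ Gamma(a0, b0)
mu0, lam0 = 0.0, 1e-2
a0, b0 = 1e-2, 1e-2

E_tau = 1.0  # initial expected precision
for _ in range(50):
    # Message to the mean node: because the gamma parent is conjugate,
    # q(mu) stays Gaussian; only its natural parameters are updated.
    lam_n = lam0 + n * E_tau
    m_n = (lam0 * mu0 + E_tau * x.sum()) / lam_n
    # Message to the precision node: q(tau) stays gamma, using the
    # expected squared deviation E[(x_i - mu)^2] under q(mu).
    E_sq = ((x - m_n) ** 2).sum() + n / lam_n
    a_n = a0 + n / 2
    b_n = b0 + 0.5 * E_sq
    E_tau = a_n / b_n

print(m_n, E_tau)  # posterior mean near 2.0, expected precision near 4
```

Each update only needs expectations of the other factor's sufficient statistics, which is why conjugacy makes the messages closed-form.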

== Literature ==
* {{Cite Q | Q139488859 }}
*{{Cite thesis |type=PhD |title=Variational Algorithms for Approximate Bayesian Inference |url=http://www.cs.toronto.edu/~beal/thesis/beal03.pdf |last=Beal |first=M.J. |year=2003 |publisher=Gatsby Computational Neuroscience Unit, University College London |access-date=2007-02-15 |archive-url=https://web.archive.org/web/20050428173705/http://www.cs.toronto.edu/~beal/thesis/beal03.pdf |archive-date=2005-04-28 |url-status=dead }}


==References==
{{Reflist}}
*{{cite journal |first1=J.M. |last1=Winn |first2=C. |last2=Bishop |title=Variational Message Passing |journal=Journal of Machine Learning Research |volume=6 |pages=661–694 |year=2005 |url=http://www.johnwinn.org/Publications/papers/VMP2004.pdf |format=PDF}}

==External links==