
Proceedings Paper

Direct message passing for hybrid Bayesian networks and performance analysis
Author(s): Wei Sun; K. C. Chang

Paper Abstract

Probabilistic inference for hybrid Bayesian networks, which involve both discrete and continuous variables, has been an important research topic in recent years. This is not only because a number of efficient inference algorithms have been developed and have matured for simple types of networks such as purely discrete models, but also because continuous variables are unavoidable in practice when modeling complex systems. Pearl's message passing algorithm provides a simple framework for computing posterior distributions by propagating messages between nodes, and it yields exact answers for polytree models with purely discrete or purely continuous variables. In addition, applying Pearl's message passing to networks with loops usually converges and produces good approximations. For hybrid models, however, a general scheme is needed for passing messages between different types of variables. In this paper, we develop such a method, called Direct Message Passing (DMP), for exchanging messages between discrete and continuous variables. Based on Pearl's algorithm, we derive formulae to compute messages for variables in the various dependence relationships encoded in conditional probability distributions. A mixture of Gaussians is used to represent continuous messages, with the number of mixture components up to the size of the joint state space of all discrete parents. For polytree Conditional Linear Gaussian (CLG) Bayesian networks, DMP has the same computational requirements as the Junction Tree (JT) algorithm and provides the same exact solution. However, while JT works only for CLG models, DMP can be applied to general nonlinear, non-Gaussian hybrid models to produce approximate solutions using the unscented transformation and loopy propagation. Furthermore, the algorithm can be scaled by restricting the number of mixture components in the messages.
Empirically, we find that the approximation errors introduced by this restriction are relatively small, especially for nodes far away from the discrete parent nodes. Numerical simulations show encouraging results.
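The scaling step described above can be sketched in code. The snippet below is an illustrative reduction of a one-dimensional Gaussian-mixture message: the lowest-weight components are merged into a single moment-matched Gaussian so the mixture never exceeds a chosen size. The function names (`moment_match`, `reduce_mixture`) and the greedy lowest-weight merge strategy are assumptions for illustration; the paper does not specify this particular reduction rule.

```python
import numpy as np

def moment_match(weights, means, variances):
    """Collapse a 1-D Gaussian mixture into a single Gaussian
    that preserves the mixture's mean and variance."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()                      # renormalize within the merged set
    mu = np.asarray(means, dtype=float)
    var = np.asarray(variances, dtype=float)
    m = np.sum(w * mu)                   # overall mean
    v = np.sum(w * (var + (mu - m) ** 2))  # law of total variance
    return m, v

def reduce_mixture(weights, means, variances, max_components):
    """Cap a Gaussian-mixture message at max_components by merging
    the lowest-weight components into one moment-matched Gaussian.
    Components are (weight, mean, variance) tuples."""
    comps = sorted(zip(weights, means, variances), reverse=True)
    if len(comps) <= max_components:
        return comps
    keep = comps[:max_components - 1]
    merge = comps[max_components - 1:]
    w_m, mu_m, var_m = zip(*merge)
    m, v = moment_match(w_m, mu_m, var_m)
    keep.append((sum(w_m), m, v))
    return keep

# Example: a 3-component message reduced to 2 components.
reduced = reduce_mixture([0.5, 0.3, 0.2], [0.0, 1.0, 2.0],
                         [1.0, 1.0, 1.0], max_components=2)
```

Moment matching keeps the merged message's first two moments exact, which is one common way to trade accuracy for cost when the number of components would otherwise grow with the joint discrete state space.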

Paper Details

Date Published: 27 April 2010
PDF: 9 pages
Proc. SPIE 7697, Signal Processing, Sensor Fusion, and Target Recognition XIX, 76970S (27 April 2010); doi: 10.1117/12.852088
Author Affiliations:
Wei Sun, George Mason Univ. (United States)
K. C. Chang, George Mason Univ. (United States)

Published in SPIE Proceedings Vol. 7697:
Signal Processing, Sensor Fusion, and Target Recognition XIX
Ivan Kadar, Editor(s)

© SPIE.