Inference in Hybrid Bayesian Networks with Deterministic Variables
Speakers
Details
The main goal of this presentation is to describe an architecture for solving large hybrid Bayesian networks (BNs) containing discrete, continuous, and deterministic variables. In the presence of deterministic variables, we must contend with the non-existence of a joint density. We represent deterministic conditional distributions using Dirac delta functions; using the properties of Dirac delta functions, we can handle a large class of deterministic functions. The architecture we develop is an extension of the Shenoy-Shafer architecture for discrete BNs.
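As an illustration of the Dirac delta idea (a sketch only, not the presenters' implementation): a deterministic conditional such as Z = X + Y can be written as the generalized density δ(z − x − y), and marginalizing a variable out then reduces to the sifting property of the delta function. The density f_X below is an illustrative stand-in chosen for this example.

```python
# Sketch: representing a deterministic conditional with a Dirac delta and
# marginalizing via the sifting property, using SymPy's symbolic DiracDelta.
import sympy as sp

x, y, z = sp.symbols('x y z', real=True)

# Deterministic conditional Z = X + Y, represented as delta(z - x - y).
delta = sp.DiracDelta(z - x - y)

# An illustrative density for X (standard normal kernel as a stand-in).
f_X = sp.exp(-x**2 / 2) / sp.sqrt(2 * sp.pi)

# Marginalizing X out of f_X(x) * delta(z - x - y): the sifting property
# replaces x by z - y, so no closed-form joint density is ever needed.
marginal = sp.integrate(f_X * delta, (x, -sp.oo, sp.oo))
print(sp.simplify(marginal))
```

The result is f_X evaluated at x = z − y, which is exactly how delta functions let deterministic conditionals coexist with continuous densities during marginalization.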
A major hurdle in performing inference in hybrid BNs is the marginalization of continuous variables, which involves integrating combinations of conditional probability density functions (PDFs). In this paper, we suggest the use of mixture-of-polynomials (MOP) approximations of PDFs, similar in spirit to mixture-of-truncated-exponentials (MTE) approximations. MOP functions are easily integrated and are closed under combination and marginalization. This enables us to propagate MOP potentials in the extended Shenoy-Shafer architecture for inference in hybrid BNs that include deterministic variables.
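To see why polynomials are attractive here, consider a minimal one-piece MOP-style stand-in (an illustrative sketch, not the paper's construction): fit a polynomial to a PDF on a bounded interval, and note that integration stays in closed form because the antiderivative of a polynomial is again a polynomial, and products of polynomials (combination) remain polynomials.

```python
# Sketch: a single-piece polynomial approximation of the standard normal PDF
# on [-3, 3], fit by least squares; the degree and interval are illustrative.
import numpy as np

xs = np.linspace(-3.0, 3.0, 601)
pdf = np.exp(-xs**2 / 2) / np.sqrt(2 * np.pi)

# Fit a degree-8 polynomial to the sampled PDF values.
coeffs = np.polyfit(xs, pdf, deg=8)
poly = np.poly1d(coeffs)

# Closure under integration: the antiderivative is another polynomial,
# so marginalization reduces to evaluating it at the interval endpoints.
antideriv = poly.integ()
total_mass = antideriv(3.0) - antideriv(-3.0)
print(total_mass)  # close to the normal's ~0.9973 mass on [-3, 3]

# Closure under combination: a product of polynomials is a polynomial.
product = poly * poly
```

A full MOP potential would be piecewise, with one polynomial per region, but each piece behaves exactly like this single-piece example under integration and multiplication.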
Additional Information
Prakash Shenoy (University of Kansas)