Bayesian updating in causal probabilistic networks by local computations
The focus of this paper is the derivation of a partial order of length l for finding the l most probable composite hypotheses, where l is typically much smaller than the total number of composite hypotheses in a network. Previously, only a partial order of length two (i.e., l = 2) could be derived efficiently in a singly connected Bayesian network without further restrictions on network topology or an increase in spatial complexity.
In diagnostic Bayesian networks for technical diagnosis, the conditional probability tables for most nodes with parents are logical ORs. For a specific application, the probability tables for all other nodes are filled in by domain experts in the field being modeled. It is further assumed that exactly one of the variables represented by the parent nodes will be in its positive state (i.e., that fault will occur), so the conditional probabilities over all parent variables of any particular node always sum to one. For example, for each node with parents, knowledge acquisition questions are developed whose answers yield the conditional probability associated with each parent node.
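As a rough illustration of the problem being solved (the paper's efficient partial-order derivation is not reproduced here), the l most probable composite hypotheses of a small network can be found by brute-force enumeration of all joint assignments. The two-fault network and all probability values below are invented for the example; the OR-gate table mirrors the diagnostic assumption described above.

```python
from itertools import product

# Hypothetical two-fault network: faults A and B, symptom C (all binary).
# Priors and the CPT are made-up numbers for illustration only.
p_a = {0: 0.9, 1: 0.1}          # P(A)
p_b = {0: 0.8, 1: 0.2}          # P(B)
# P(C=1 | A, B): a logical OR of the two faults
p_c1 = {(0, 0): 0.0, (0, 1): 1.0, (1, 0): 1.0, (1, 1): 1.0}

def joint(a, b, c):
    """Joint probability of one composite hypothesis (a, b, c)."""
    pc1 = p_c1[(a, b)]
    return p_a[a] * p_b[b] * (pc1 if c == 1 else 1.0 - pc1)

def l_best(l):
    """The l most probable composite hypotheses, by exhaustive enumeration.
    This is exponential in the number of variables; the paper's partial
    order of length l avoids enumerating all composite hypotheses."""
    hyps = [((a, b, c), joint(a, b, c)) for a, b, c in product((0, 1), repeat=3)]
    hyps.sort(key=lambda h: h[1], reverse=True)
    return hyps[:l]

for hyp, p in l_best(3):
    print(hyp, round(p, 3))
```

For this toy network the three most probable composite hypotheses are the no-fault assignment, then each single-fault assignment with the symptom present.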
In the scientific literature, Bayesian networks are referred to by various names: Bayes nets, causal probabilistic networks, Bayesian belief networks, or simply belief networks.
A set of conditional probability tables, one for each node, defines the dependence of each node on its parents.
Nodes without parents, sometimes called source nodes, instead have an associated prior marginal probability table.
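To make the representation concrete, here is a minimal sketch (with invented numbers) of a two-node network: a source node carries a prior marginal table, a child node carries a conditional probability table, and their product is the joint distribution, which sums to one by construction.

```python
from itertools import product

# Hypothetical binary nodes: source node F (a fault) and child node S (a symptom).
prior_f = {0: 0.95, 1: 0.05}      # prior marginal table for the source node F
cpt_s = {                         # P(S | F): one row per parent state
    0: {0: 0.99, 1: 0.01},
    1: {0: 0.10, 1: 0.90},
}

def joint(f, s):
    """Chain-rule factorisation: P(F, S) = P(F) * P(S | F)."""
    return prior_f[f] * cpt_s[f][s]

total = sum(joint(f, s) for f, s in product((0, 1), repeat=2))
print(total)  # the tables together define a proper joint distribution
```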
Parameters are estimated by a modified junction tree algorithm.
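The modified junction tree algorithm itself is not reproduced here; as a stand-in, the exact posterior it would deliver on a small diagnostic network can be computed by direct enumeration. The network and numbers below are invented for illustration.

```python
from itertools import product

# Hypothetical diagnostic network: faults A and B with an OR-gate symptom C.
p_a = {0: 0.9, 1: 0.1}
p_b = {0: 0.8, 1: 0.2}
p_c1 = {(0, 0): 0.0, (0, 1): 1.0, (1, 0): 1.0, (1, 1): 1.0}  # P(C=1 | A, B)

def posterior_faults(c_obs):
    """P(A, B | C = c_obs) by enumeration; a junction tree computes the same
    exact posteriors by local message passing on a tree of cliques, which
    scales to networks where enumeration is infeasible."""
    def like(a, b):
        return p_c1[(a, b)] if c_obs == 1 else 1.0 - p_c1[(a, b)]
    unnorm = {(a, b): p_a[a] * p_b[b] * like(a, b)
              for a, b in product((0, 1), repeat=2)}
    z = sum(unnorm.values())
    return {ab: p / z for ab, p in unnorm.items()}

print(posterior_faults(1))  # observing the symptom redistributes mass onto the faults
```

With the symptom observed, the single-fault explanation with the higher prior dominates the posterior, as one would expect from a diagnostic model.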
The approach is illustrated with the Alarm network.