We present a general framework for defining priors on model structure and sampling from the posterior using the Metropolis-Hastings algorithm. The key ideas are that structure priors are defined via a probability tree and that the proposal distribution for the Metropolis-Hastings algorithm is defined using the prior, thereby yielding a cheaply computable acceptance probability. We have applied this approach to Bayesian net structure learning using a number of priors and proposal distributions. Our results show that the prior and proposal distribution must be chosen appropriately for this approach to be successful.
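To illustrate why building the proposal from the prior makes the acceptance probability cheap, here is a minimal sketch (not the paper's code) of an independence Metropolis-Hastings sampler over a hypothetical discrete structure space. When the proposal q equals the prior p, the prior and proposal terms cancel in the acceptance ratio, leaving only a likelihood ratio. The structure space, prior weights, and likelihood table below are invented for illustration.

```python
import random

# Hypothetical discrete "structure" space with assumed prior and
# likelihood tables (illustration only, not the paper's setup).
structures = [0, 1, 2, 3]
prior = [0.4, 0.3, 0.2, 0.1]       # p(m), assumed for illustration
likelihood = [1.0, 2.0, 4.0, 8.0]  # p(D | m), assumed for illustration

def sample_prior():
    """Draw a structure from the prior; this is also the MH proposal."""
    return random.choices(structures, weights=prior, k=1)[0]

def mh_sample(n_steps, seed=0):
    """Independence MH with proposal = prior; returns visit frequencies."""
    random.seed(seed)
    current = sample_prior()
    counts = [0] * len(structures)
    for _ in range(n_steps):
        proposal = sample_prior()
        # Full ratio: [p(D|m') p(m') q(m)] / [p(D|m) p(m) q(m')].
        # With q = p, the prior/proposal terms cancel, so the
        # acceptance probability is just the likelihood ratio.
        accept = min(1.0, likelihood[proposal] / likelihood[current])
        if random.random() < accept:
            current = proposal
        counts[current] += 1
    return [c / n_steps for c in counts]

if __name__ == "__main__":
    freqs = mh_sample(200_000)
    # Visit frequencies should approximate the normalized
    # posterior, which is proportional to p(m) * p(D|m).
    print(freqs)
```

The chain's visit frequencies converge to the posterior, which is proportional to `prior[m] * likelihood[m]`, even though each step only evaluates a likelihood ratio.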
Title of host publication: Proceedings of the Seventeenth Annual Conference on Uncertainty in Artificial Intelligence (UAI-2001)
Editors: Jack Breese, Daphne Koller
Place of publication: Seattle
Publisher: Morgan Kaufmann Publishers Inc.
Number of pages: 8
Publication status: Published - 1 Aug 2001