Abstract
A general method for defining informative priors on statistical models is presented and applied specifically to the space of classification and regression trees. A Bayesian approach to learning such models from data is taken, with the Metropolis-Hastings algorithm being used to approximately sample from the posterior. By using only proposal distributions closely tied to the prior, acceptance probabilities are easily computable via marginal likelihood ratios, whatever prior is used. Our approach is empirically tested by varying (i) the data, (ii) the prior and (iii) the proposal distribution. A comparison with related work is given.
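As a minimal sketch of the simplification the abstract refers to, consider a proposal distribution q tied to the prior p(T) over trees so that the prior and proposal terms cancel in the Metropolis-Hastings ratio (the cancellation condition below is an illustrative assumption; the paper's specific proposals are not reproduced here). The acceptance probability then depends only on the two marginal likelihoods:

```latex
% Sketch: T is the current tree, T' the proposed tree, D the data.
% Assumed cancellation condition (illustrative, not quoted from the paper):
%   q(T' \mid T)\, p(T) = q(T \mid T')\, p(T').
\alpha(T \to T')
  = \min\!\left(1,\;
      \frac{p(D \mid T')\, p(T')\, q(T \mid T')}
           {p(D \mid T)\, p(T)\, q(T' \mid T)}\right)
  = \min\!\left(1,\; \frac{p(D \mid T')}{p(D \mid T)}\right).
```

Under such a proposal the acceptance probability is a marginal likelihood ratio, so it remains easy to compute regardless of which informative prior is placed on the tree space.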
| Original language | English |
| --- | --- |
| Title of host publication | Proceedings of the Nineteenth International Joint Conference on Artificial Intelligence |
| Publisher | Professional Book Center |
| Pages | 641-646 |
| Number of pages | 5 |
| ISBN (Print) | 0938075934 |
| Publication status | Published - 2005 |
| Event | IJCAI-05 - Edinburgh, Scotland, Duration: 5 Jul 2005 → … |
Conference
| Conference | IJCAI-05 |
| --- | --- |
| City | Edinburgh, Scotland |
| Period | 5/07/05 → … |