Abstract
Bayesian algorithms have recently been used in a wide variety of applications. This paper proposes a new methodology for hyperparameter initialization in the Variational Bayes (VB) algorithm. We employ a dual expectation-maximization (EM) algorithm as the initialization stage of VB-based learning. In the first stage, the EM algorithm is run on the given data set, while the second-stage EM algorithm is applied to the distributions of parameters resulting from several runs of the first-stage EM. The graphical-model case study considered in this paper is a mixture of Gaussians, with appropriate conjugate prior distributions used for modelling the parameters. The proposed methodology is applied to blind source separation of modulated signals.
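The two-stage idea in the abstract can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: it assumes synthetic 1-D two-component data (the paper works with modulated signals and conjugate priors), a plain EM routine `em_gmm` written here for the example, and quantile-based initialization with a small random jitter so that repeated stage-1 runs differ. Stage 1 runs EM on the data several times; stage 2 runs EM on the pooled stage-1 mean estimates, and its output could seed the Gaussian prior hyperparameters of a VB run.

```python
import numpy as np

def em_gmm(x, k, n_iter=100, jitter_seed=0):
    """Plain EM for a 1-D Gaussian mixture (quantile init plus small random jitter)."""
    rng = np.random.default_rng(jitter_seed)
    mu = np.quantile(x, (np.arange(k) + 0.5) / k) + rng.normal(0.0, 0.1, k)
    var = np.full(k, x.var())
    pi = np.full(k, 1.0 / k)
    for _ in range(n_iter):
        # E-step: responsibility of each component for each data point
        dens = pi * np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
        resp = dens / dens.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, variances (variance floored for stability)
        nk = resp.sum(axis=0)
        pi = nk / len(x)
        mu = (resp * x[:, None]).sum(axis=0) / nk
        var = np.maximum((resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk, 1e-6)
    return mu, var, pi

# Synthetic two-component data (a stand-in for the paper's source signals)
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(-2.0, 0.5, 200), rng.normal(3.0, 0.5, 200)])

# Stage 1: several EM runs on the data yield a sample of component-mean estimates
mean_samples = np.concatenate(
    [np.sort(em_gmm(x, 2, jitter_seed=s)[0]) for s in range(5)]
)

# Stage 2: EM on the distribution of stage-1 estimates; the fitted means and
# variances can then initialize the Gaussian prior hyperparameters for VB
hyper_mu, hyper_var, _ = em_gmm(mean_samples, 2)
print(np.sort(hyper_mu))
```

Here the stage-2 means land near the true component centres, and the stage-2 variances quantify how much the stage-1 estimates scatter across runs, which is exactly the kind of spread a prior hyperparameter should encode.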
Original language | English |
---|---|
Title of host publication | 2003 IEEE INTERNATIONAL CONFERENCE ON SYSTEMS, MAN AND CYBERNETICS, VOLS 1-5, CONFERENCE PROCEEDINGS |
Place of Publication | NEW YORK |
Publisher | IEEE |
Pages | 474-479 |
Number of pages | 6 |
ISBN (Print) | 0-7803-7952-7 |
Publication status | Published - 2003 |
Event | IEEE International Conference on Systems, Man and Cybernetics (SMC 03) - WASHINGTON, 5 Oct 2003 → 8 Oct 2003 |
Conference
Conference | IEEE International Conference on Systems, Man and Cybernetics (SMC 03) |
---|---|
City | WASHINGTON |
Period | 5/10/03 → 8/10/03 |
Keywords
- Gaussian mixtures
- Bayesian inference
- variational learning
- expectation-maximization algorithm
- MODELS