Variational Gaussian mixtures for blind source detection

N Nasios, A G Bors

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Bayesian algorithms have lately been used in a large variety of applications. This paper proposes a new methodology for hyperparameter initialization in the Variational Bayes (VB) algorithm. We employ a dual expectation-maximization (EM) algorithm as the initialization stage in VB-based learning. In the first stage, the EM algorithm is applied to the given data set, while the second EM algorithm is applied to the distributions of parameters resulting from several runs of the first-stage EM. The graphical model case study considered in this paper is a mixture of Gaussians, with appropriate conjugate prior distributions used to model the parameters. The proposed methodology is applied to the blind source separation of modulated signals.
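The following is a minimal sketch of the dual-EM initialization idea described in the abstract, using synthetic data and scikit-learn's GaussianMixture / BayesianGaussianMixture as stand-ins; it is an illustration under these assumptions, not the authors' implementation.

```python
import numpy as np
from sklearn.mixture import GaussianMixture, BayesianGaussianMixture

rng = np.random.default_rng(0)

# Synthetic 2-D data drawn from a 3-component Gaussian mixture (placeholder data).
true_means = np.array([[-4.0, 0.0], [0.0, 3.0], [4.0, -1.0]])
X = np.vstack([rng.normal(m, 1.0, size=(300, 2)) for m in true_means])

# Stage 1: several EM runs on the data, each from a different random start;
# collect the component means estimated by each run.
first_stage_means = []
for seed in range(10):
    em = GaussianMixture(n_components=3, init_params="random", random_state=seed)
    em.fit(X)
    first_stage_means.append(em.means_)
first_stage_means = np.vstack(first_stage_means)   # shape (10 * 3, 2)

# Stage 2: a second EM run on the distribution of first-stage parameter
# estimates (here, the pooled means), giving stable location hyperparameters.
em2 = GaussianMixture(n_components=3, random_state=0)
em2.fit(first_stage_means)
hyper_means = em2.means_                            # candidate prior locations

# Use the second-stage estimate to set the prior location of a VB mixture.
# scikit-learn accepts only a single shared mean prior, so the centroid of
# the second-stage means is used here purely as an illustration.
vb = BayesianGaussianMixture(
    n_components=3,
    mean_prior=hyper_means.mean(axis=0),
    mean_precision_prior=1.0,
    random_state=0,
)
vb.fit(X)
print("second-stage means:\n", hyper_means)
print("VB posterior means:\n", vb.means_)
```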

Original language: English
Title of host publication: 2003 IEEE INTERNATIONAL CONFERENCE ON SYSTEMS, MAN AND CYBERNETICS, VOLS 1-5, CONFERENCE PROCEEDINGS
Place of publication: NEW YORK
Publisher: IEEE
Pages: 474-479
Number of pages: 6
ISBN (Print): 0-7803-7952-7
Publication status: Published - 2003
Event: IEEE International Conference on Systems, Man and Cybernetics (SMC 03) - WASHINGTON
Duration: 5 Oct 2003 - 8 Oct 2003

Conference

Conference: IEEE International Conference on Systems, Man and Cybernetics (SMC 03)
City: WASHINGTON
Period: 5/10/03 - 8/10/03

Keywords

  • Gaussian mixtures
  • Bayesian inference
  • variational learning
  • expectation-maximization algorithm
  • MODELS
