Two entropy-based methods for learning unsupervised Gaussian mixture models

Title:
Two entropy-based methods for learning unsupervised Gaussian mixture models
Source:
Structural, Syntactic, and Statistical Pattern Recognition (Joint IAPR International Workshops, SSPR 2006 and SPR 2006, Hong Kong, China, August 17-19, 2006), pp. 649-657
Publisher Information:
Berlin: Springer, 2006.
Publication Year:
2006
Physical Description:
print, 18 ref.
Original Material:
INIST-CNRS
Subject Terms:
Computer science, Exact sciences and technology, Applied sciences, Computer science; control theory; systems, Software, Memory organisation. Data processing, Data processing. List processing. Character string processing, Artificial intelligence, Pattern recognition. Digital image processing. Computational geometry, EM algorithm, Optimal algorithm, Statistical analysis, Structural analysis, Syntactic analysis, Unsupervised learning, Quality control, Probability density, Entropy, Density estimation, Parameter estimation, Probability density function, Color image, Initialization, Maximum likelihood, Density measurement, Modeling, Pattern recognition, Image recognition, Information system, Mixture theory
Document Type:
Conference Paper
File Description:
text
Language:
English
Author Affiliations:
Robot Vision Group, Alicante University, Spain
ISSN:
0302-9743
Rights:
Copyright 2007 INIST-CNRS
CC BY 4.0
Unless otherwise stated above, the content of this bibliographic record may be used under a CC BY 4.0 licence by Inist-CNRS.
Notes:
Computer science; control theory; systems
Accession Number:
edscal.19152020
Database:
PASCAL Archive

Further Information

In this paper we address the problem of estimating the parameters of a Gaussian mixture model. Although the EM (Expectation-Maximization) algorithm yields the maximum-likelihood solution, it requires a careful initialization of the parameters, and the optimal number of kernels in the mixture may be unknown beforehand. We propose a criterion based on the entropy of the pdf (probability density function) associated with each kernel to measure the quality of a given mixture model. Two different methods for estimating Shannon entropy are proposed, and a modification of the classical EM algorithm to find the optimal number of kernels in the mixture is presented. We test our algorithm on probability density estimation, pattern recognition and color image segmentation.
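
The abstract does not spell out how the entropy criterion is computed, but the general idea can be sketched: among all densities with a given covariance, the Gaussian has maximum differential entropy, so a kernel whose assigned data have a markedly lower estimated entropy is poorly modelled by a single Gaussian and is a candidate for splitting. The Python sketch below is illustrative only and rests on stated assumptions: a Parzen/KDE resubstitution entropy estimate stands in for the paper's two (unreproduced) estimators, points are hard-assigned to kernels instead of using the full EM posteriors, and the acceptance threshold is hypothetical.

# Illustrative sketch only -- not the authors' algorithm. See the assumptions above.
import numpy as np
from scipy.stats import gaussian_kde
from sklearn.mixture import GaussianMixture


def gaussian_entropy(cov):
    """Closed-form differential entropy of a d-dimensional Gaussian:
    0.5 * log((2*pi*e)^d * det(cov)). This is the maximum entropy for that covariance."""
    d = cov.shape[0]
    _, logdet = np.linalg.slogdet(cov)
    return 0.5 * (d * np.log(2.0 * np.pi * np.e) + logdet)


def empirical_entropy(samples):
    """Nonparametric (Parzen/KDE resubstitution) Shannon entropy estimate of the samples."""
    kde = gaussian_kde(samples.T)            # scipy expects shape (d, n)
    return -np.mean(kde.logpdf(samples.T))


def worst_entropy_gap(X, gmm):
    """Largest per-kernel gap between the maximum (Gaussian) entropy implied by the
    kernel's covariance and the entropy estimated from the points assigned to it.
    A large gap flags a kernel that models its data poorly."""
    labels = gmm.predict(X)                  # hard assignment: a simplification
    gaps = []
    for k in range(gmm.n_components):
        Xk = X[labels == k]
        if Xk.shape[0] <= Xk.shape[1] + 1:   # too few points for a KDE
            continue
        gaps.append(gaussian_entropy(gmm.covariances_[k]) - empirical_entropy(Xk))
    return max(gaps) if gaps else np.inf


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic data: three well-separated 2-D Gaussian clusters.
    X = np.vstack([
        rng.normal(loc=[0.0, 0.0], scale=0.5, size=(300, 2)),
        rng.normal(loc=[4.0, 0.0], scale=0.5, size=(300, 2)),
        rng.normal(loc=[2.0, 3.0], scale=0.5, size=(300, 2)),
    ])
    threshold = 0.3                          # hypothetical tolerance, not from the paper
    for n_kernels in range(1, 6):
        gmm = GaussianMixture(n_components=n_kernels, covariance_type="full",
                              random_state=0).fit(X)
        gap = worst_entropy_gap(X, gmm)
        print(f"kernels={n_kernels}  worst entropy gap={gap:.3f}")
        if gap < threshold:                  # every kernel looks sufficiently Gaussian
            break

Note that the grow-and-refit loop above is only a stand-in for the paper's actual contribution, which modifies the classical EM algorithm itself to determine the optimal number of kernels.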