Title:
Kernelized vector quantization in gradient-descent learning : Advances in Self-Organizing Maps
Source:
Neurocomputing (Amsterdam). 147:83-95
Publisher Information:
Amsterdam: Elsevier, 2015.
Publication Year:
2015
Physical Description:
print, 96 references
Original Material:
INIST-CNRS
Subject Terms:
Cognition, Computer science, Exact sciences and technology, Applied sciences, Computer science; control theory; systems, Software, Memory organisation. Data processing, Data processing. List processing. Character string processing, Artificial intelligence, Connectionism. Neural networks, Kohonen algorithm, Online algorithm, Cluster analysis, Self organization, Support vector machine, Gradient descent, Hilbert space, Euclidean space, Metric space, Euclidean geometry, Isomorphism, Kernel method, Metric, Prototype, Vector quantization, Neural network, Kernel distances, LVQ, Online learning, Self-organizing maps, Support vector machines
Document Type:
Academic Journal Article
File Description:
text
Language:
English
Author Affiliations:
University of Applied Sciences Mittweida, Computational Intelligence Group, Technikumplatz 17, 09648 Mittweida, Germany
ISSN:
0925-2312
Rights:
Copyright 2015 INIST-CNRS
CC BY 4.0
Unless otherwise stated above, the content of this bibliographic record may be used under a CC BY 4.0 licence by Inist-CNRS.
Notes:
Computer science; control theory; systems
Accession Number:
edscal.28836734
Database:
PASCAL Archive

Further Information

Prototype-based vector quantization usually proceeds in the Euclidean data space. In recent years, non-standard metrics have also become popular. For classification by support vector machines, Hilbert space representations based on so-called kernel metrics have proven very successful. In this paper we show that gradient-based learning in prototype-based vector quantization is possible by means of kernel metrics instead of the standard Euclidean distance. We show that an appropriate treatment requires differentiable universal kernels defining the feature space metric. This allows prototype adaptation in the original data space, equipped with a metric determined by the kernel and therefore isomorphic to the respective kernel Hilbert space. This approach, however, avoids the explicit Hilbert space representation known from support vector machines. We give the mathematical justification for the isomorphism and demonstrate the abilities and usefulness of the approach on several examples, including both artificial and real-world datasets.
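The scheme described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the Gaussian kernel, the learning rate, the winner-take-all update, and the toy two-cluster data are all assumptions. The key point from the abstract is that the feature-space distance k(x,x) - 2k(x,w) + k(w,w) is differentiable with respect to the prototype w, so the prototype can be adapted by gradient descent in the original data space.

```python
import numpy as np

def gaussian_kernel(x, w, sigma=1.0):
    # illustrative choice of a differentiable universal kernel
    return np.exp(-np.sum((x - w) ** 2) / (2 * sigma ** 2))

def kernel_dist_sq(x, w, sigma=1.0):
    # squared feature-space distance k(x,x) - 2 k(x,w) + k(w,w);
    # for the Gaussian kernel k(x,x) = k(w,w) = 1, so this is 2 - 2 k(x,w)
    return 2.0 - 2.0 * gaussian_kernel(x, w, sigma)

def kvq_step(x, prototypes, eta=0.1, sigma=1.0):
    # winner = prototype closest to x in the kernel metric
    d = [kernel_dist_sq(x, w, sigma) for w in prototypes]
    j = int(np.argmin(d))
    w = prototypes[j]
    # gradient of the kernel distance w.r.t. w is -2 k(x,w) (x - w) / sigma^2
    # (k(w,w) = 1 is constant for the Gaussian kernel)
    grad = -2.0 * gaussian_kernel(x, w, sigma) * (x - w) / sigma ** 2
    prototypes[j] = w - eta * grad  # gradient descent in the data space
    return prototypes

# hypothetical toy data: two Gaussian clusters around (-2,-2) and (2,2)
rng = np.random.default_rng(0)
data = np.vstack([rng.normal(-2, 0.3, (50, 2)),
                  rng.normal(2, 0.3, (50, 2))])
prototypes = [data[0].copy(), data[50].copy()]
for _ in range(5):
    for x in rng.permutation(data):
        prototypes = kvq_step(x, prototypes, eta=0.2)
```

After a few epochs the prototypes settle near the cluster centers, even though all updates are computed in the original data space; the kernel only enters through the distance and its gradient, which mirrors the isomorphism argument of the paper.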