Result: Observations on boosting feature selection

Title:
Observations on boosting feature selection
Source:
MCS 2005: Multiple Classifier Systems (Seaside, CA, 13-15 June 2005). Lecture Notes in Computer Science, pp. 32-41
Publisher Information:
Berlin: Springer, 2005.
Publication Year:
2005
Physical Description:
Print; 19 references
Original Material:
INIST-CNRS
Document Type:
Conference Paper
File Description:
text
Language:
English
Author Affiliations:
ECE, School of EPS, Heriot-Watt University, Edinburgh, EH14 4AS, United Kingdom
ISSN:
0302-9743
Rights:
Copyright 2005 INIST-CNRS
CC BY 4.0
Unless otherwise stated above, the content of this bibliographic record may be used under a CC BY 4.0 licence by Inist-CNRS.
Notes:
Computer science; theoretical control; systems
Accession Number:
edscal.16923049
Database:
PASCAL Archive

Further Information

This paper presents a study of the Boosting Feature Selection (BFS) algorithm [1], a method that incorporates feature selection into AdaBoost. The algorithm is interesting because it combines techniques studied by both the boosting and the ensemble feature selection research communities. Observations on generalisation, weighted error and error diversity are used to compare its performance with that of AdaBoost when both use a nearest mean base learner. Ensemble feature prominence is proposed as a stopping criterion for ensemble construction, and its quality is assessed using the same performance measures. BFS is found to be competitive with AdaBoost despite the reduced feature description available to each base classifier, a result explained in terms of weighted error and error diversity. The results also show the proposed stopping criterion to be useful for trading off ensemble performance against complexity.
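The record itself reproduces no algorithm, but the abstract's description (AdaBoost with per-round feature selection and a nearest mean base learner) admits a minimal sketch. The Python code below is a hypothetical illustration only, not the authors' implementation: each boosting round fits a one-feature nearest mean learner on every feature and keeps the one with the lowest weighted error, so feature selection falls out of the boosting loop itself. The function names (bfs_adaboost, nearest_mean_predict, ensemble_predict) and all parameters are assumptions, and binary labels in {-1, +1} are assumed.

import numpy as np

def nearest_mean_predict(x, mean_pos, mean_neg):
    # Assign +1 or -1 according to which class mean (on one feature) is closer.
    return np.where(np.abs(x - mean_pos) < np.abs(x - mean_neg), 1, -1)

def bfs_adaboost(X, y, n_rounds=10):
    # Hypothetical BFS-style loop: each AdaBoost round uses a nearest mean
    # classifier restricted to a single feature, chosen to minimise the
    # weighted training error.
    n, d = X.shape
    w = np.full(n, 1.0 / n)                      # example weights
    ensemble = []                                # (alpha, feature, mean_pos, mean_neg)
    for _ in range(n_rounds):
        best = None
        for j in range(d):                       # candidate single-feature learners
            mp = np.average(X[y == 1, j], weights=w[y == 1])
            mn = np.average(X[y == -1, j], weights=w[y == -1])
            pred = nearest_mean_predict(X[:, j], mp, mn)
            err = w[pred != y].sum()             # weighted error of this feature
            if best is None or err < best[0]:
                best = (err, j, mp, mn, pred)
        err, j, mp, mn, pred = best
        err = np.clip(err, 1e-10, 1.0 - 1e-10)
        alpha = 0.5 * np.log((1.0 - err) / err)  # standard AdaBoost vote weight
        w *= np.exp(-alpha * y * pred)           # up-weight misclassified examples
        w /= w.sum()
        ensemble.append((alpha, j, mp, mn))
    return ensemble

def ensemble_predict(ensemble, X):
    # Weighted vote of the per-round single-feature learners.
    score = sum(a * nearest_mean_predict(X[:, j], mp, mn)
                for a, j, mp, mn in ensemble)
    return np.sign(score)

Counting the distinct features that appear in the returned ensemble gives one crude reading of how much the feature description of each base classifier is reduced relative to plain AdaBoost over the full feature set. The paper's own "ensemble feature prominence" stopping criterion is not specified in this record, so it is not sketched here.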