Result: A Parallel Mixture of SVMs for Very Large Scale Problems

Title:
A Parallel Mixture of SVMs for Very Large Scale Problems
Source:
Advances in Neural Information Processing Systems 14 (ISBN 9780262271738)
Publisher Information:
MIT Press - Journals, 2002.
Publication Year:
2002
Document Type:
Journal article; part of a book or chapter of a book
File Description:
application/xml
Language:
English
ISSN:
1530-888X
0899-7667
DOI:
10.1162/089976602753633402
10.7551/mitpress/1120.003.0086
Accession Number:
edsair.doi.dedup.....2de61c20f8d02572dbc0eb5bf9e92953
Database:
OpenAIRE

Further Information

Support vector machines (SVMs) are the state-of-the-art models for many classification problems, but they suffer from the complexity of their training algorithm, which is at least quadratic in the number of examples. Hence, it is hopeless to try to solve real-life problems with more than a few hundred thousand examples with a single SVM. This article proposes a new mixture of SVMs that can easily be implemented in parallel and in which each SVM is trained on a small subset of the whole data set. Experiments on a large benchmark data set (Forest) yielded a significant reduction in training time (empirically, training time appears to grow locally linearly with the number of examples). In addition, and surprisingly, a significant improvement in generalization was observed.
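The idea in the abstract — train one SVM per small, disjoint subset of the data, then combine the experts through a learned gater — can be sketched as follows. This is a hedged illustration, not the paper's exact method: the paper trains a neural-network gater jointly with the experts, whereas here the gater is a simple logistic regression fitted over the experts' decision values, and all names (`experts`, `expert_outputs`, `gater`) are illustrative.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic stand-in for a large data set: two well-separated Gaussian classes.
n = 2000
X = np.vstack([rng.normal(-2.0, 1.0, (n // 2, 2)),
               rng.normal(2.0, 1.0, (n // 2, 2))])
y = np.array([0] * (n // 2) + [1] * (n // 2))
perm = rng.permutation(n)
X, y = X[perm], y[perm]
X_train, y_train = X[:1500], y[:1500]
X_test, y_test = X[1500:], y[1500:]

# Split the training set into disjoint subsets, one per expert.
# Each expert sees only a small fraction of the data, so its (at least
# quadratic) training cost stays small.
n_experts = 5
subsets = np.array_split(np.arange(len(X_train)), n_experts)

# Train one SVM per subset. The fits are independent, so in a real
# deployment each one could run on a separate worker in parallel.
experts = []
for idx in subsets:
    svm = SVC(kernel="rbf", gamma="scale")
    svm.fit(X_train[idx], y_train[idx])
    experts.append(svm)

def expert_outputs(models, X):
    """Stack each expert's decision value into one feature column per expert."""
    return np.column_stack([m.decision_function(X) for m in models])

# A lightweight gater: logistic regression over the experts' outputs
# (an assumption here; the paper uses a jointly trained neural-network gater).
gater = LogisticRegression().fit(expert_outputs(experts, X_train), y_train)
pred = gater.predict(expert_outputs(experts, X_test))
accuracy = (pred == y_test).mean()
```

On easy data like this, the gated mixture recovers essentially the single-SVM accuracy while no expert ever trains on more than 300 examples, which is the source of the near-linear empirical scaling the abstract reports.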