
Least Squares Support Vector Machines
Contributor(s): Johan A. K. Suykens, et al. (Author)
ISBN: 9812381511     ISBN-13: 9789812381514
Publisher: World Scientific Publishing Company
OUR PRICE:   $105.45  
Product Type: Hardcover
Published: November 2002
Annotation: This book focuses on Least Squares Support Vector Machines (LS-SVMs), which are reformulations of standard SVMs. LS-SVMs are closely related to regularization networks and Gaussian processes but additionally emphasize and exploit primal-dual interpretations from optimization theory. The authors explain the natural links between LS-SVM classifiers and kernel Fisher discriminant analysis. Bayesian inference of LS-SVM models is discussed, together with methods for imposing sparseness and employing robust statistics.

The framework is further extended towards unsupervised learning by treating principal component analysis (PCA) and its kernel version as a one-class modelling problem. This leads to new primal-dual support vector machine formulations for kernel PCA and kernel CCA. Furthermore, LS-SVM formulations are given for recurrent networks and control. In general, support vector machines may pose heavy computational challenges for large data sets. For this purpose, a method of fixed-size LS-SVM is proposed where the estimation is done in the primal space in relation to a Nyström sampling with active selection of support vectors. The methods are illustrated with several examples.
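The least-squares reformulation described above replaces the standard SVM's inequality constraints with equality constraints, so training reduces to solving a single linear system in the dual variables and the bias term. A minimal sketch of LS-SVM regression is given below; the RBF kernel, the toy data, and the hyperparameter values are illustrative choices, not taken from the book's text:

```python
import numpy as np

def rbf_kernel(A, B, sigma=1.0):
    # Gram matrix of the Gaussian (RBF) kernel between row sets A and B
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def lssvm_fit(X, y, gamma=10.0, sigma=1.0):
    # LS-SVM regression: training is one linear system,
    # [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y]
    n = len(X)
    K = rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)
    return sol[1:], sol[0]          # alpha, b

def lssvm_predict(Xnew, X, alpha, b, sigma=1.0):
    # f(x) = sum_i alpha_i K(x, x_i) + b
    return rbf_kernel(Xnew, X, sigma) @ alpha + b

# toy 1-D regression problem
X = np.linspace(-3, 3, 40).reshape(-1, 1)
y = np.sinc(X).ravel()
alpha, b = lssvm_fit(X, y, gamma=100.0, sigma=0.7)
yhat = lssvm_predict(X, X, alpha, b, sigma=0.7)
```

Note that every training point receives a nonzero `alpha`, which is why the book's chapters on sparseness and fixed-size LS-SVM matter for large data sets.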

Additional Information
BISAC Categories:
- Computers | Neural Networks
- Computers | Intelligence (AI) & Semantics
- Computers | Networking - General
Dewey: 006
LCCN: 2002033063
Physical Information: 0.84" H x 6.52" W x 9.48" (1.23 lbs) 308 pages
 