Title: A support vector machine formulation to PCA analysis and its kernel version
Authors: Suykens, Johan ×
Van Gestel, Tony
Vandewalle, Joos
De Moor, Bart #
Issue Date: Mar-2003
Publisher: IEEE (Institute of Electrical and Electronics Engineers)
Series Title: IEEE Transactions on Neural Networks, vol. 14, issue 2, pp. 447-450
Abstract: In this letter, we present a simple and straightforward primal-dual support vector machine formulation of the problem of principal component analysis (PCA) in dual variables. By considering a mapping to a high-dimensional feature space and applying the kernel trick (Mercer's theorem), kernel PCA is obtained as introduced by Schölkopf et al. While least squares support vector machine classifiers have a natural link with kernel Fisher discriminant analysis (minimizing the within-class scatter around targets +1 and -1), PCA can be interpreted as a one-class modeling problem with a zero target value around which the variance is maximized. The score variables are interpreted as error variables within the problem formulation. In this way, primal-dual constrained optimization interpretations of linear and kernel PCA are obtained in a style similar to that of least squares support vector machine (LS-SVM) classifiers.
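The formulation sketched in the abstract can be summarized as follows; this is a minimal sketch for illustration, with the notation (γ, e_k, α_k, Ω_c) and sign conventions assumed rather than quoted from the paper. Given data points x_k, k = 1,…,N, with sample mean μ̂, maximizing the variance of the scores around the zero target reads in the primal as

\[
\max_{w,\,e}\; J_P(w,e) \;=\; \frac{\gamma}{2}\sum_{k=1}^{N} e_k^2 \;-\; \frac{1}{2}\, w^\top w
\quad \text{s.t.} \quad e_k = w^\top (x_k - \hat{\mu}),\; k = 1,\dots,N,
\]

where the score variables e_k play the role of error variables in the constraints. Eliminating w and e from the optimality conditions of the Lagrangian leaves the dual eigenvalue problem

\[
\Omega_c\,\alpha \;=\; \lambda\,\alpha, \qquad \lambda = 1/\gamma,
\]

with \(\Omega_c\) the centered Gram matrix, \((\Omega_c)_{kl} = (x_k - \hat{\mu})^\top (x_l - \hat{\mu})\), so linear PCA is recovered in the dual variables α. Replacing these inner products by a Mercer kernel evaluated on the feature map, i.e. using the centered kernel matrix, gives kernel PCA as described in the abstract.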
ISSN: 1045-9227
Publication status: published
KU Leuven publication type: IT
Appears in Collections: ESAT - STADIUS, Stadius Centre for Dynamical Systems, Signal Processing and Data Analytics
× corresponding author
# (joint) last author
