Conference

An Optimal Basis for Feature Extraction With Support Vector Machine Classification Using The Radius-Margin Bound

Abstract

A method is presented for deriving an optimal basis for features classified with a support vector machine. The method is based on minimizing the leave-one-out error, which is approximated by the radius-margin bound. A gradient descent method provides a learning rule for the basis in the outer loop of an iteration; the inner loop performs support vector machine training and supplies the support vector coefficients on which the gradient descent depends. In this way, the basis for feature extraction and the support vector machine are jointly optimized. The efficacy of the method is illustrated with examples from multi-dimensional synthetic data sets.
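The alternating scheme described in the abstract can be sketched as follows. This is a simplified illustration, not the authors' algorithm: it uses a linear SVM, a numerical gradient in place of the paper's analytic gradient through the support vector coefficients, and approximates the enclosing-sphere radius by the maximum distance to the data centroid. All function names and parameters are hypothetical.

```python
import numpy as np
from sklearn.svm import SVC

def radius_margin_bound(Z, y, C=10.0):
    """Inner loop: train a linear SVM on projected features Z and return
    the radius-margin bound R^2 * ||w||^2 plus the fitted classifier."""
    svm = SVC(kernel="linear", C=C).fit(Z, y)
    w = svm.coef_.ravel()
    # Radius of a sphere enclosing the data, approximated here by the
    # maximum distance to the centroid (a simplification of the exact
    # minimum-enclosing-sphere computation).
    R = np.max(np.linalg.norm(Z - Z.mean(axis=0), axis=1))
    return (R ** 2) * (w @ w), svm

def optimize_basis(X, y, k=2, lr=0.01, iters=10):
    """Outer loop: gradient descent on the basis B (columns are basis
    vectors), minimizing the radius-margin bound of the projection X @ B.
    A finite-difference gradient stands in for the analytic one."""
    rng = np.random.default_rng(0)
    d = X.shape[1]
    B = rng.standard_normal((d, k))
    B /= np.linalg.norm(B, axis=0)          # unit-norm basis columns
    eps = 1e-4
    for _ in range(iters):
        bound, _ = radius_margin_bound(X @ B, y)
        grad = np.zeros_like(B)
        for i in range(d):                  # finite-difference gradient:
            for j in range(k):              # re-train the SVM per entry
                Bp = B.copy()
                Bp[i, j] += eps
                bp, _ = radius_margin_bound(X @ Bp, y)
                grad[i, j] = (bp - bound) / eps
        B -= lr * grad                      # descent step on the basis
        B /= np.linalg.norm(B, axis=0)      # re-normalize columns
    return B
```

Each outer iteration re-trains the SVM on the current projection, mirroring the joint optimization of basis and classifier described above; the paper's analytic gradient would replace the costly finite-difference loop.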

Authors

Fortuna J; Capson D

Volume

5

Pagination

pp. V-V

Publication Date

May 1, 2006

DOI

10.1109/ICASSP.2006.1661338

Conference proceedings

2006 IEEE International Conference on Acoustics, Speech and Signal Processing Proceedings