An Optimal Basis for Feature Extraction With Support Vector Machine Classification Using The Radius-Margin Bound


abstract

  • A method is presented for deriving an optimal basis for features classified with a support vector machine. The method minimizes the leave-one-out error, which is approximated by the radius-margin bound. A gradient descent method provides a learning rule for the basis in the outer loop of an iteration; the inner loop performs support vector machine training and provides the support vector coefficients on which the gradient descent depends. In this way, the derivation of a basis for feature extraction and the support vector machine are jointly optimized. The efficacy of the method is illustrated with examples from multi-dimensional synthetic data sets.
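The abstract's two-loop structure can be sketched in code. The following is a minimal illustration, not the authors' algorithm: it fits a linear SVM on projected features (inner loop) and performs gradient descent on the basis `B` to reduce the radius-margin bound R² · ‖w‖² (outer loop). The paper derives an analytic gradient from the support vector coefficients; here a finite-difference gradient with backtracking is used purely for illustration, and the radius `R` is estimated as the largest distance from the feature centroid. All function names, data, and hyperparameters below are hypothetical.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

def radius_margin_bound(B, X, y):
    """Radius-margin surrogate R^2 * ||w||^2 for features Z = X @ B."""
    Z = X @ B
    clf = SVC(kernel="linear", C=10.0).fit(Z, y)      # inner loop: SVM training
    w = clf.coef_.ravel()
    R = np.max(np.linalg.norm(Z - Z.mean(axis=0), axis=1))  # radius estimate
    return (R ** 2) * (w @ w)

def optimize_basis(X, y, n_features=2, steps=10, lr=1e-3, eps=1e-4):
    """Outer loop: descend the bound w.r.t. the basis B (finite differences)."""
    B = rng.normal(size=(X.shape[1], n_features))
    history = [radius_margin_bound(B, X, y)]
    for _ in range(steps):
        g = np.zeros_like(B)
        for i in range(B.shape[0]):
            for j in range(B.shape[1]):
                Bp = B.copy(); Bp[i, j] += eps
                Bm = B.copy(); Bm[i, j] -= eps
                g[i, j] = (radius_margin_bound(Bp, X, y)
                           - radius_margin_bound(Bm, X, y)) / (2 * eps)
        # Backtracking: only accept a step that actually lowers the bound.
        step, cur = lr, history[-1]
        while step > 1e-8:
            Bn = B - step * g
            if radius_margin_bound(Bn, X, y) < cur:
                B = Bn
                break
            step /= 2
        history.append(radius_margin_bound(B, X, y))
    return B, history

# Synthetic two-class data in 5 dimensions (hypothetical example).
X = rng.normal(size=(80, 5))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)
B, history = optimize_basis(X, y)
```

After running, `history` is non-increasing: each outer-loop step only accepts a basis update that lowers the bound, mirroring the joint optimization described in the abstract.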

publication date

  • May 2006