Pruning Error Minimization in Least Squares Support Vector Machines

Bas J. de Kruif, Theo J.A. de Vries

    Research output: Contribution to journal › Article › Academic › peer-review


    Abstract

    The support vector machine (SVM) is a method for classification and function approximation. It commonly employs an ε-insensitive cost function, meaning that errors smaller than ε are not penalized. As an alternative, the least squares support vector machine (LSSVM) uses a quadratic cost function. When the LSSVM is used for function approximation, the solution obtained is not sparse. Sparseness is imposed by pruning, i.e., recursively solving the approximation problem and subsequently omitting data that had a small error in the previous pass. However, a small approximation error in the previous pass does not reliably predict how large the error will be once the sample has actually been omitted. In this paper, a procedure is introduced that selects from the data set the training sample whose omission introduces the smallest approximation error. It is shown that this pruning scheme outperforms the standard one.
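
    As a rough illustration of the idea, the sketch below (Python/NumPy) fits an LSSVM regressor with an RBF kernel and, in each pruning pass, drops the training sample whose omission is estimated to introduce the smallest approximation error. The kernel choice, the regularization value gamma, the helper names (rbf_kernel, lssvm_fit, prune_one) and, in particular, the per-sample estimate |alpha_i / (A^-1)_ii| are assumptions made for this sketch, not the paper's exact derivation.

    import numpy as np

    def rbf_kernel(X1, X2, sigma=1.0):
        # Gaussian RBF kernel matrix between two 1-D sample vectors.
        return np.exp(-(X1[:, None] - X2[None, :]) ** 2 / (2.0 * sigma ** 2))

    def lssvm_fit(X, y, gamma=100.0, sigma=1.0):
        # Solve the LSSVM regression system
        #   [ 0   1^T           ] [ b     ]   [ 0 ]
        #   [ 1   K + I / gamma ] [ alpha ] = [ y ]
        # and return the dual coefficients, the bias, and the system matrix.
        n = len(X)
        A = np.zeros((n + 1, n + 1))
        A[0, 1:] = 1.0
        A[1:, 0] = 1.0
        A[1:, 1:] = rbf_kernel(X, X, sigma) + np.eye(n) / gamma
        sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
        return sol[1:], sol[0], A

    def prune_one(X, y, gamma=100.0, sigma=1.0):
        # One pruning pass: estimate, for every sample, the approximation
        # error its omission would introduce, and drop the sample with the
        # smallest estimate.  The estimate |alpha_i / (A^-1)_ii| is an
        # illustrative criterion; the standard scheme would instead drop the
        # sample with the smallest |alpha_i|.
        alpha, _, A = lssvm_fit(X, y, gamma, sigma)
        diag = np.diag(np.linalg.inv(A))[1:]      # skip the bias row/column
        induced = np.abs(alpha / diag)
        i = int(np.argmin(induced))               # cheapest sample to omit
        return np.delete(X, i), np.delete(y, i)

    # Usage: prune half of a noisy sine training set.
    rng = np.random.default_rng(0)
    X = np.linspace(0.0, 2.0 * np.pi, 40)
    y = np.sin(X) + 0.05 * rng.standard_normal(X.shape)
    for _ in range(20):
        X, y = prune_one(X, y)
    print(len(X), "samples remain after pruning")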
    Original language: English
    Pages (from-to): 696-702
    Number of pages: 7
    Journal: IEEE Transactions on Neural Networks
    Volume: 14
    Issue number: 3
    DOIs
    Publication status: Published - 2003

