Comparison of four support-vector based function approximators

Bas J. de Kruif, Theo J.A. de Vries

    Research output: Contribution to conference › Paper


    Abstract

    One of the uses of the support vector machine (SVM), as introduced in V.N. Vapnik (2000), is as a function approximator. The SVM and approximators based on it approximate a relation in data by interpolating between so-called support vectors: a limited number of samples selected from the data. Several support-vector based function approximators are compared in this research. The comparison focuses on the following subjects: i) how many support vectors are involved in achieving a certain approximation accuracy, ii) how well noisy training samples are handled, and iii) how ambiguous training data is dealt with. The comparison shows that the so-called key sample machine (KSM) outperforms the other schemes, specifically on aspects i and ii. The distinctive features that explain this are the quadratic cost function and the use of all the training data to train the limited number of parameters.
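
    The sparsity property described above (aspect i) can be illustrated with a generic epsilon-insensitive support vector regressor; this is a hedged sketch using scikit-learn's `SVR`, not the KSM or any of the specific schemes compared in the paper. Samples that fall inside the epsilon tube incur no loss, so only a subset of the training samples end up as support vectors:

    ```python
    import numpy as np
    from sklearn.svm import SVR

    # Hypothetical 1-D regression task: noisy samples of sin(x).
    rng = np.random.default_rng(0)
    X = np.linspace(0.0, 2.0 * np.pi, 100).reshape(-1, 1)
    y = np.sin(X).ravel() + rng.normal(scale=0.1, size=100)

    # Epsilon-insensitive SVR: residuals smaller than epsilon are not
    # penalized, so the fitted model is supported by only some samples.
    svr = SVR(kernel="rbf", C=10.0, epsilon=0.1)
    svr.fit(X, y)

    n_sv = len(svr.support_)
    print(f"{n_sv} support vectors out of {len(X)} samples")
    ```

    Widening `epsilon` typically reduces the number of support vectors at the cost of approximation accuracy, which is the trade-off that aspect i of the comparison examines.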
    Original language: English
    Pages: 549-554
    DOI: 10.1109/IJCNN.2004.1379968
    Publication status: Published - 2004
    Event: 2004 IEEE International Joint Conference on Neural Networks, IJCNN 2004 - Budapest, Hungary
    Duration: 25 Jul 2004 - 29 Jul 2004

    Conference

    Conference: 2004 IEEE International Joint Conference on Neural Networks, IJCNN 2004
    Abbreviated title: IJCNN
    Country: Hungary
    City: Budapest
    Period: 25/07/04 - 29/07/04


  • Cite this

    de Kruif, B. J., & de Vries, T. J. A. (2004). Comparison of four support-vector based function approximators. 549-554. Paper presented at 2004 IEEE International Joint Conference on Neural Networks, IJCNN 2004, Budapest, Hungary. https://doi.org/10.1109/IJCNN.2004.1379968