Training Neural Networks for minimum average risk with a special application to context dependent learning

    Research output: Contribution to journal › Article › Academic › peer-review



    The least squares criterion, as used by the backpropagation learning rule in multi-layer feed-forward neural networks, does not always yield a solution that is in accordance with the desired behaviour of the neural network. This is, for example, the case when differentiation between different types of errors is required and the costs of the error types must be taken into account. In this paper the application of other error measures, specifically matched to the application, is investigated. The error measures used are based on the average risk, a function that is a weighted combination of the probabilities of the different types of errors that may occur. Special attention is paid to applications where the input patterns are not independent, and the average risk does not depend on the output for a single input pattern but on its neighbourhood, or context. The ideas are illustrated with pulse detection in a one-dimensional signal.
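    The average-risk idea described above can be sketched in a few lines. The following is a minimal illustration, not the paper's method: it assumes a binary detector whose two error types (false positives and false negatives) carry different costs, and contrasts a cost-weighted criterion with plain least squares. The function names and cost values are hypothetical.

    ```python
    import numpy as np

    def least_squares(outputs, targets):
        # Standard backpropagation criterion: mean squared error.
        return np.mean((outputs - targets) ** 2)

    def average_risk(outputs, targets, c_fp=1.0, c_fn=5.0):
        # Weighted combination of the two error types:
        #   false positives: target 0 but output high,
        #   false negatives: target 1 but output low.
        # With c_fp == c_fn == 1 this reduces to least squares.
        fp = (1 - targets) * outputs ** 2        # penalise output > 0 when target is 0
        fn = targets * (1 - outputs) ** 2        # penalise output < 1 when target is 1
        return np.mean(c_fp * fp + c_fn * fn)

    outputs = np.array([0.9, 0.2, 0.6, 0.1])
    targets = np.array([1.0, 0.0, 1.0, 0.0])
    print(least_squares(outputs, targets))
    print(average_risk(outputs, targets))
    ```

    Raising `c_fn` above `c_fp` makes missed pulses more expensive than false alarms, so gradient-based training is pushed toward a detector with fewer false negatives, at the price of more false positives.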
    Original language: Undefined
    Pages (from-to): 1221-1236
    Number of pages: 16
    Journal: Pattern Recognition Letters
    Issue number: 12
    Publication status: Published - 1995


    • METIS-111738
    • IR-14688
    • SCS-Safety
    • EWI-14026
