The least squares criterion, as used by the backpropagation learning rule in multi-layer feed-forward neural networks, does not always yield a solution that is in accordance with the desired behaviour of the neural network. This is the case, for example, when differentiation between different types of errors is required and the costs of the error types must be taken into account. In this paper the application of other error measures, specifically matched to the application, is investigated. The error measures used are based on the average risk, a function that is a weighted combination of the probabilities of the different types of errors that may occur. Special attention is paid to applications where the input patterns are not independent, and the average risk does not depend on the output for a single input pattern, but on its neighbourhood, or context. The ideas are illustrated with pulse detection in a one-dimensional signal.
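The average-risk idea can be sketched as follows: assign each error type (e.g. a missed pulse versus a false alarm) its own cost, and score a classifier by the cost-weighted frequency of its errors rather than by a symmetric squared-error criterion. The cost values and class labels below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Hypothetical cost matrix: COSTS[t, p] is the cost of deciding class p
# when the true class is t (0 = "no pulse", 1 = "pulse").
# The weights are illustrative: a missed pulse is assumed to cost
# five times as much as a false alarm.
COSTS = np.array([[0.0, 1.0],   # true "no pulse": false alarm costs 1
                  [5.0, 0.0]])  # true "pulse":    missed pulse costs 5

def average_risk(y_true, y_pred):
    """Cost-weighted average over the observed decisions:
    a weighted combination of the empirical error-type frequencies."""
    total = sum(COSTS[t, p] for t, p in zip(y_true, y_pred))
    return total / len(y_true)

# Toy signal labels: two pulses among five samples.
y_true = [1, 1, 0, 0, 0]
miss_risk  = average_risk(y_true, [0, 0, 0, 0, 0])  # two missed pulses
alarm_risk = average_risk(y_true, [1, 1, 1, 1, 0])  # two false alarms
```

With these costs, the detector that misses both pulses incurs a higher average risk (2.0) than the one that raises two false alarms (0.4), even though both make the same number of errors; a plain least squares criterion would not distinguish the two.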