Abstract
This paper presents a mathematical analysis of the occurrence of temporary minima during training of a single-output, two-layer neural network with learning according to the back-propagation algorithm. A new vector decomposition method is introduced, which considerably simplifies the mathematical analysis of neural-network learning. The analysis shows that temporary minima are inherent to learning in multilayer networks. A number of numerical results illustrate the analytical conclusions.
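The setting of the abstract can be illustrated with a minimal sketch: a single-output, two-layer (one hidden layer) sigmoid network trained by plain back-propagation, whose error curve often stalls on a plateau (a temporary minimum) before decreasing further. The task (XOR), architecture size, and all hyperparameters below are illustrative assumptions, not taken from the paper.

```python
import math
import random

random.seed(0)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# XOR patterns: a classic task that requires a hidden layer.
patterns = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]

# Two hidden units (2 inputs + bias each) and one output unit (2 inputs + bias).
w_hid = [[random.uniform(-0.5, 0.5) for _ in range(3)] for _ in range(2)]
w_out = [random.uniform(-0.5, 0.5) for _ in range(3)]
lr = 0.5  # illustrative learning rate

def forward(x):
    h = [sigmoid(w[0] * x[0] + w[1] * x[1] + w[2]) for w in w_hid]
    y = sigmoid(w_out[0] * h[0] + w_out[1] * h[1] + w_out[2])
    return h, y

losses = []
for epoch in range(5000):
    total = 0.0
    for x, t in patterns:
        h, y = forward(x)
        total += 0.5 * (t - y) ** 2
        # Back-propagation: output delta, then hidden deltas.
        d_out = (y - t) * y * (1 - y)
        d_hid = [d_out * w_out[j] * h[j] * (1 - h[j]) for j in range(2)]
        for j in range(2):
            w_out[j] -= lr * d_out * h[j]
        w_out[2] -= lr * d_out
        for j in range(2):
            w_hid[j][0] -= lr * d_hid[j] * x[0]
            w_hid[j][1] -= lr * d_hid[j] * x[1]
            w_hid[j][2] -= lr * d_hid[j]
    losses.append(total)

print(f"initial loss {losses[0]:.3f}, final loss {losses[-1]:.3f}")
```

Plotting `losses` over epochs typically shows the near-flat stretch before the final descent that the paper analyzes as a temporary minimum; whether and when the plateau occurs depends on the random initialization.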
| Original language | English |
|---|---|
| Pages (from-to) | 1387-1403 |
| Journal | Neural Networks |
| Volume | 7 |
| Issue number | 9 |
| DOIs | |
| Publication status | Published - 1994 |
Keywords
- Temporary minimum
- Pattern classification
- Learning
- Neural networks
- Multilayer perceptron
- Back propagation