
A fast and robust learning algorithm for feedforward neural networks

(1991) Neural Networks, 4(3), p. 361–369
Author
Weymaere, Nico; Martens, Jean-Pierre
Abstract
The back-propagation algorithm caused a tremendous breakthrough in the application of multilayer perceptrons. However, it has some important drawbacks: long training times and sensitivity to the presence of local minima. Another problem is the network topology: the exact number of units in a particular hidden layer, as well as the number of hidden layers, needs to be known in advance, and a lot of time is often spent finding the optimal topology. In this article, we consider multilayer networks with one hidden layer of Gaussian units and an output layer of conventional units. We show that for networks of this kind it is possible to perform a fast dimensionality analysis by analyzing only a small fraction of the input patterns. Moreover, as a result of this approach, it is possible to initialize the weights of the network before starting the back-propagation training. Several classification problems are taken as examples.
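The abstract does not spell out the algorithm itself, so the following is only an illustrative sketch, not the authors' method: a one-hidden-layer network of Gaussian (RBF-style) units whose centers are placed by plain k-means (the paper's keywords mention a *modified* k-means variant), with output weights initialized by linear least squares before any back-propagation fine-tuning. All function names, the fixed `width` parameter, and the toy data are assumptions for illustration.

```python
import numpy as np

def kmeans(X, k, iters=20, seed=0):
    """Plain k-means; a stand-in for the paper's modified clustering step."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # distances (n_samples, k), then assign each point to nearest center
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):           # skip empty clusters
                centers[j] = X[labels == j].mean(axis=0)
    return centers

def init_rbf_net(X, Y, k=4, width=1.0):
    """Initialize the network: Gaussian-unit centers from k-means,
    output-layer weights by linear least squares (pre-backprop init)."""
    centers = kmeans(X, k)
    d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
    H = np.exp(-(d ** 2) / (2 * width ** 2))          # Gaussian activations
    H1 = np.hstack([H, np.ones((len(X), 1))])         # bias column
    W, *_ = np.linalg.lstsq(H1, Y, rcond=None)
    return centers, W

def rbf_predict(X, centers, W, width=1.0):
    """Forward pass of the initialized network."""
    d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
    H = np.exp(-(d ** 2) / (2 * width ** 2))
    return np.hstack([H, np.ones((len(X), 1))]) @ W

# Toy two-class problem: two well-separated Gaussian blobs
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.3, (50, 2)), rng.normal(2, 0.3, (50, 2))])
Y = np.vstack([np.tile([1., 0.], (50, 1)), np.tile([0., 1.], (50, 1))])
centers, W = init_rbf_net(X, Y, k=4)
```

Such an initialization typically starts gradient training far closer to a good solution than random weights would, which is the kind of speed-up the abstract claims.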
Keywords
GAUSSIAN HIDDEN UNITS, FEEDFORWARD NEURAL NETWORKS, CONVENTIONAL OUTPUT UNITS, FAST DIMENSIONALITY ANALYSIS, MODIFIED K-MEANS CLUSTERING, OPTIMIZATION, INITIALIZATION OF THE WEIGHTS, BROAD PHONETIC CLASSIFICATION

Citation


Chicago
Weymaere, Nico, and Jean-Pierre Martens. 1991. “A Fast and Robust Learning Algorithm for Feedforward Neural Networks.” Neural Networks 4 (3): 361–369.
APA
Weymaere, N., & Martens, J.-P. (1991). A fast and robust learning algorithm for feedforward neural networks. NEURAL NETWORKS, 4(3), 361–369.
Vancouver
Weymaere N, Martens J-P. A fast and robust learning algorithm for feedforward neural networks. NEURAL NETWORKS. 1991;4(3):361–9.
MLA
Weymaere, Nico, and Jean-Pierre Martens. “A Fast and Robust Learning Algorithm for Feedforward Neural Networks.” NEURAL NETWORKS 4.3 (1991): 361–369. Print.
@article{208054,
  abstract     = {The back propagation algorithm caused a tremendous breakthrough in the application of multilayer perceptrons. However, it has some important drawbacks: long training times and sensitivity to the presence of local minima. Another problem is the network topology; the exact number of units in a particular hidden layer, as well as the number of hidden layers need to be known in advance. A lot of time is often spent in finding the optimal topology. In this article, we consider multilayer networks with one hidden layer of Gaussian units and an output layer of conventional units. We show that for this kind of networks, it is possible to perform a fast dimensionality analysis, by analyzing only a small fraction of the input patterns. Moreover, as a result of this approach, it is possible to initialize the weights of the network before starting the back propagation training. Several classification problems are taken as examples.},
  author       = {Weymaere, Nico and Martens, Jean-Pierre},
  issn         = {0893-6080},
  journal      = {NEURAL NETWORKS},
  language     = {eng},
  number       = {3},
  pages        = {361--369},
  title        = {A fast and robust learning algorithm for feedforward neural networks},
  url          = {http://dx.doi.org/10.1016/0893-6080(91)90072-D},
  volume       = {4},
  year         = {1991},
}
