An enhanced version of backpropagation uses a momentum term and flat
spot elimination. It is listed among the SNNS learning functions
as `BackpropMomentum`.

The momentum term introduces the old weight change as a parameter for the computation of the new weight change. This avoids oscillation problems common with the regular backpropagation algorithm when the error surface has a very narrow minimum area. The new weight change is computed by

    Δw_ij(t+1) = η · δ_j · o_i + μ · Δw_ij(t)

where μ is a constant specifying the influence of the momentum.

The effect of these enhancements is that flat spots of the error surface are traversed relatively rapidly with a few big steps, while the step size is decreased as the surface gets rougher. This adaptation of the step size increases learning speed significantly.
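The two enhancements can be sketched as follows. This is a minimal illustration, not the SNNS source: the function names and the default values for η, μ, and the flat-spot constant c are hypothetical.

```python
def backprop_momentum_step(w, delta_o, prev_dw, eta=0.2, mu=0.9):
    """One weight update with a momentum term.

    delta_o is the usual backpropagation gradient term (delta_j * o_i);
    prev_dw is the weight change from the previous step.
    """
    dw = eta * delta_o + mu * prev_dw  # momentum reuses the old weight change
    return w + dw, dw


def logistic_deriv_flat_spot(out, c=0.1):
    """Derivative of the logistic activation with flat spot elimination.

    Adding a small constant c keeps the derivative away from zero, so
    learning does not stall on flat regions of the error surface.
    """
    return out * (1.0 - out) + c
```

Because the old weight change is fed back in, successive steps in the same downhill direction grow larger, which produces the "few big steps across flat spots" behaviour described above.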

**Note** that the old weight change is lost every time the parameters
are modified, new patterns are loaded, or the network is modified.

Niels.Mache@informatik.uni-stuttgart.de

Tue Nov 28 10:30:44 MET 1995