Weight decay was introduced by P. Werbos ([Wer88]). It decreases the weights of the connections while they are trained with backpropagation: in addition to each backpropagation update of a weight, the weight is decreased by a fraction d of its old value. The resulting formula is

    Δw(t) = Δw_BP(t) − d · w(t)

where Δw_BP(t) is the ordinary backpropagation update.
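This update rule can be sketched as a single training step. The following is a minimal illustration, not a reference implementation; the function name, the learning rate eta, and the default values of eta and d are assumptions for the example:

```python
def update_weight(w, grad, eta=0.1, d=0.001):
    """One training step: backpropagation update plus weight decay.

    Besides the gradient step, the weight is decreased by a
    fraction d of its old value.
    """
    delta_bp = -eta * grad          # ordinary backpropagation update
    return w + delta_bp - d * w     # shrink by a part d of the old weight

# A weight that receives no gradient signal decays toward zero:
w = 1.0
for _ in range(100):
    w = update_weight(w, grad=0.0)
```

After 100 such steps with zero gradient the weight has shrunk to (1 − d)^100 of its initial value, illustrating why only weights that backpropagation keeps reinforcing survive.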
The effect is similar to that of the pruning algorithms (see chapter ): weights are driven to zero unless they are reinforced by backpropagation. For further information, see [Sch94].