The Cascade Correlation Algorithms


Two cascade correlation algorithms have been implemented in SNNS: Cascade-Correlation and Recurrent Cascade-Correlation. Both learning algorithms were developed by Scott Fahlman ([FL91], [HF91], [Fah91]). Strictly speaking, the cascade architecture is a kind of meta-algorithm in which ordinary learning algorithms such as Backprop, Quickprop, or Rprop are embedded. Cascade-Correlation is a constructive learning rule: it starts with a minimal network consisting only of an input and an output layer and, while minimizing the overall error of the net, adds new hidden units to the hidden layer one at a time.

Cascade-Correlation is a supervised learning architecture that builds a near-minimal multi-layer network topology. Its two main advantages are that the user does not need to specify the network topology in advance, and that Cascade-Correlation learns much faster than the usual learning algorithms.
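The constructive loop described above can be sketched in a few lines of NumPy. This is an illustrative sketch only, not the SNNS implementation: plain least squares and simple gradient ascent stand in for the embedded Quickprop/Rprop training, a single candidate unit replaces Fahlman's pool of candidates, and all function names here are ours.

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_output(H, T):
    """Train the output layer (linear least squares here; in SNNS this
    step would be done by the embedded Quickprop/Rprop algorithm)."""
    return np.linalg.lstsq(H, T, rcond=None)[0]

def train_candidate(H, E, epochs=500, lr=0.2):
    """Train one tanh candidate unit to maximize the covariance between
    its activation and the residual error E (a simplified version of
    Fahlman's correlation measure S, with a single candidate)."""
    w = rng.normal(scale=0.5, size=H.shape[1])
    for _ in range(epochs):
        v = np.tanh(H @ w)
        Vc = v - v.mean()
        Ec = E - E.mean(axis=0)
        sign = np.sign(Vc @ Ec)            # sign of covariance, per output
        dv = (Ec @ sign) * (1.0 - v ** 2)  # dS/d(net input), tanh derivative
        w += lr * H.T @ dv / len(H)        # gradient ascent on S
    return w

def cascade_correlation(X, T, max_hidden=5, tol=0.05):
    H = np.hstack([X, np.ones((len(X), 1))])  # inputs plus a bias unit
    for _ in range(max_hidden):
        W = fit_output(H, T)
        E = H @ W - T                          # residual error of the net
        if np.mean(E ** 2) < tol:
            break                              # net is already good enough
        w = train_candidate(H, E)
        # Freeze the candidate's input weights; its output becomes a new,
        # cascaded input seen by the output layer and by later candidates.
        H = np.hstack([H, np.tanh(H @ w)[:, None]])
    return H, fit_output(H, T)

# XOR: the initial net (no hidden units) cannot solve it, so hidden units
# are added until the error drops below tol or max_hidden is reached.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)
H, W = cascade_correlation(X, T)
print("MSE:", np.mean((H @ W - T) ** 2))
```

Note the defining property of the cascade architecture: once a hidden unit is installed, its input weights are frozen, so each training stage only ever optimizes the newest candidate and the output layer.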