New Features of SNNS 4.3

Version 4.3 of SNNS features the following improvements and extensions over the earlier version 4.2:

1.
included the patches available from the SNNS development project.

2.
License changed to LGPL v2.

3.
Fixed some bugs in the installation configuration files.

New Features of SNNS 4.2

Version 4.2 of SNNS features the following improvements and extensions over the earlier version 4.1:

1.
greatly improved installation procedure

2.
pattern remapping functions introduced to SNNS

3.
class information in patterns introduced to SNNS

4.
change to all batch algorithms: the learning rate is now divided by the number of patterns in the set. This allows direct comparison of learning rates and makes training large pattern files with BP-Batch practical, since it no longer requires ridiculously small learning rates like 0.0000001 (see the example below).
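To illustrate the effect with invented numbers: on a set of 100,000 patterns, an entered learning rate of 0.2 is applied to the summed weight changes as 0.2/100000 = 0.000002, while on a set of 100 patterns it is applied as 0.2/100 = 0.002. The same entered value therefore behaves comparably on small and large pattern sets.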

5.
Changes to Cascade-Correlation:
(a)
Several modifications can be used to obtain a net with a smaller depth or a smaller fan-in.
(b)
New activation functions ACT_GAUSS and ACT_SIN
(c)
The backpropagation algorithm of Cascade-Correlation is now available in an offline and a batch version.
(d)
The activations of the units can now be cached. This speeds up learning for nets with many units; on the other hand, the required memory grows for large training pattern sets.
(e)
Changes in the 2D display: the hidden units are displayed in layers, and the candidate units are placed on top of the net.
(f)
validation now possible
(g)
automatic deletion of candidate units at the end of training.

6.
new meta-learning algorithm TACOMA.

7.
new learning algorithm BackpropChunk. It allows chunkwise updating of the weights (with a chunk size of n, the accumulated weight changes are applied after every n patterns, placing it between the online and batch extremes) as well as selective training of units on the basis of pattern class names.

8.
new learning algorithm RPROP with weight decay.

9.
algorithm ``Recurrent Cascade Correlation'' deleted from repository.

10.
the options for adding noise to the weights with the JogWeights function were improved in multiple ways.

11.
improved plotting in the graph panel, as well as a printing option.

12.
when the standard colormap is full, SNNS now starts with a private colormap instead of aborting.

13.
the analyze tool now features a confusion matrix; a small example follows.
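As an illustration (with invented counts), a confusion matrix for a two-class problem tabulates, for each target class, how the patterns were classified:

                  classified A    classified B
    target A           48               2
    target B            5              45

The diagonal holds the correctly classified patterns; off-diagonal entries show which classes get confused with each other.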

14.
pruning panel now more ``SNNS-like''; the panel no longer needs to be closed before pruning a network.

15.
Changes in batchman (a short script sketch follows this list):
(a)
batchman can now handle DLVQ training

(b)
new batchman command ``setActFunc'' allows changing unit activation functions from within a training script. Thanks to Thomas Rausch, University of Dresden, Germany.

(c)
batchman output is now prefixed with ``#''. This enables direct processing by many Unix tools such as gnuplot.

(d)
batchman now automatically converts function parameters to the correct type instead of aborting.

(e)
jogWeights can now also be called from batchman

(f)
batchman catches some non-fatal signals (SIGINT, SIGTERM, ...) and sets the internal variable SIGNAL so that the script can react to them.

(g)
batchman features the ResetNet function (e.g. for Jordan networks).
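
The following batchman sketch ties several of these changes together. It is a minimal, hypothetical example: the exact argument lists of setActFunc, jogWeights, and ResetNet are assumptions based on the descriptions above; consult the batchman chapter for the authoritative forms.

    # a minimal sketch; setActFunc, jogWeights, and ResetNet
    # argument lists are assumptions, not taken from the manual
    loadNet("encoder.net")
    loadPattern("encoder.pat")
    setInitFunc("Randomize_Weights", 1.0, -1.0)
    initNet()
    setLearnFunc("Std_Backpropagation", 0.2)

    # (b) change a unit's activation function from within the script
    # (assumed form: unit number, activation function name)
    setActFunc(1, "Act_Logistic")

    # (f) stop cleanly when a caught signal sets SIGNAL
    while SSE > 0.01 and CYCLES < 1000 and SIGNAL == 0 do
        trainNet()
        # (c) "#"-prefixed output can be piped straight into gnuplot
        print("# ", CYCLES, " ", SSE)
        # (e) occasionally jog the weights (assumed noise-range arguments)
        if CYCLES mod 100 == 0 then
            jogWeights(-0.01, 0.01)
        endif
    endwhile

    # (g) reset the unit activations, e.g. for a Jordan network
    ResetNet()
    saveNet("encoder.trained.net")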

16.
new tool ``linknets'' introduced to combine existing networks

17.
new tools ``td_bignet'' and ``ff_bignet'' introduced for script-based generation of network files; the old tool bignet was removed.

18.
displays are refreshed more often when using the graphical editor

19.
weight and projection displays with a changed color scale; they now match the 2D-display scale.

20.
pat_sel can now handle pattern files with multi-line comments

21.
manpages now available for most of the SNNS programs.

22.
the number of settings stored in an xgui configuration file was greatly increased.

23.
Extensive debugging:
(a)
batchman now computes the MSE correctly from the number of (sub-)patterns.

(b)
RBFs now receive the correct number of parameters.

(c)
spurious segmentation faults in the graphical editor tracked and eliminated.

(d)
segmentation fault when training on huge pattern files cleared.

(e)
various segmentation faults on individual operating systems tracked and cleared.

(f)
netperf can now test networks that need multiple training parameters.

(g)
segmentation faults when displaying 3D networks cleared.

(h)
correct default values for initialization functions in batchman.

(i)
the call ``TestNet()'' used to prevent further training in batchman. Now everything works as expected.

(j)
segmentation fault in batchman when doing multiple string concatenations cleared, and a memory leak in the string operations closed. Thanks to Walter Prins, University of Stellenbosch, South Africa.

(k)
the output of the validation error in the shell window was giving wrong values.

(l)
algorithm SCG now respects special units and handles them correctly.

(m)
the description of the learning function parameters in section 4.4 is finally ordered alphabetically.