New Features of SNNS 4.3

Version 4.3 of SNNS features the following improvements and extensions over the earlier version 4.2:

Included patches available from the SNNS development project.

License changed to LGPL v2.

Fixed some bugs in the installation configuration files.

New Features of SNNS 4.2

Version 4.2 of SNNS features the following improvements and extensions over the earlier version 4.1:

Greatly improved installation procedure.

Pattern remapping functions introduced to SNNS.

Class information in patterns introduced to SNNS.

Change to all batch algorithms: the learning rate is now divided by the number of patterns in the set. This allows direct comparison of learning rates and makes training on large pattern files with BP-Batch practical, since it no longer requires extremely small learning rates like 0.0000001.
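
In symbols (our notation, not taken from the SNNS manual): with learning rate $\eta$ and $P$ patterns in the set, the batch update is effectively

```latex
\Delta w \;=\; -\frac{\eta}{P} \sum_{p=1}^{P} \frac{\partial E_p}{\partial w}
```

so $\eta$ no longer has to shrink as the pattern file grows.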

Changes to Cascade-Correlation:
Several modifications can be used to achieve a net with a smaller depth or smaller Fan-In.
New activation functions ACT_GAUSS and ACT_SIN
The backpropagation algorithm of Cascade-Correlation is now present in an offline and a batch version.
The activations of the units can be cached. This results in faster learning for nets with many units; on the other hand, the required memory grows for large training pattern sets.
Changes in the 2D display: hidden units are displayed in layers, and candidate units are placed at the top of the net.
Validation now possible.
Automatic deletion of candidate units at the end of training.

New meta-learning algorithm TACOMA.

New learning algorithm BackpropChunk. It allows chunkwise updating of the weights as well as selective training of units based on pattern class names.
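
As a minimal sketch, a batchman script could select the new algorithm as below. The file names are placeholders, and BackpropChunk's parameter list is deliberately omitted here; consult the learning-function reference for the exact parameters:

```
# minimal batchman sketch (file names are placeholders)
loadNet("letters.net")
loadPattern("letters.pat")
setLearnFunc("BackpropChunk")   # parameters omitted; see the manual
for i := 1 to 100 do
   trainNet()
endfor
saveNet("letters.trained.net")
```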

New learning algorithm RPROP with weight decay.

Algorithm ``Recurrent Cascade Correlation'' deleted from the repository.

The options for adding noise to the weights with the JogWeights function were improved in multiple ways.

Improved plotting in the graph panel, as well as a printing option.

When the standard colormap is full, SNNS now starts with a private colormap instead of aborting.

The analyze tool now features a confusion matrix.

The pruning panel is now more ``SNNS-like''. You no longer need to close the panel before pruning a network.

Changes in batchman
batchman can now handle DLVQ training

New batchman command ``setActFunc'' allows changing unit activation functions from within the training script. Thanks to Thomas Rausch, University of Dresden, Germany.
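
A hypothetical invocation (the argument order and the unit number/function name are illustrative assumptions; verify against the batchman chapter of the manual):

```
# switch the activation function of unit 7 before training
# (unit number and function name are illustrative)
setActFunc(7, "Act_TanH")
trainNet()
```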

batchman output now carries a ``#'' prefix. This enables direct processing by many Unix tools, such as gnuplot, which treat lines starting with ``#'' as comments.

batchman now automatically converts function parameters to the correct type instead of aborting.

jogWeights can now also be called from batchman.

batchman catches some non-fatal signals (SIGINT, SIGTERM, ...) and sets the internal variable SIGNAL so that the script can react to them.
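
A sketch of a training loop that reacts to SIGNAL. The control-flow keywords and C-like comparison operators are assumed from the batchman language description; the file name is a placeholder:

```
# train until done or a caught signal arrives, then checkpoint
i := 0
while i < 10000 and SIGNAL == 0 do
   trainNet()
   i := i + 1
endwhile
if SIGNAL != 0 then
   saveNet("checkpoint.net")   # save before the script terminates
endif
```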

batchman features a ResetNet function (e.g. for Jordan networks).
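
For a partially recurrent network such as a Jordan net, ResetNet() can clear unit activations between passes. A sketch (loading of net and patterns elided; the epoch count is arbitrary):

```
# reset stored activations before each pass over the sequence data
for epoch := 1 to 50 do
   ResetNet()
   trainNet()
endfor
```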

New tool ``linknets'' introduced to combine existing networks.

New tools ``td_bignet'' and ``ff_bignet'' introduced for script-based generation of network files; the old tool bignet was removed.

Displays are now refreshed more often when using the graphical editor.

The weight and projection displays use a changed color scale, which now matches the 2D-display scale.

pat_sel can now handle pattern files with multi-line comments.

Manpages now available for most of the SNNS programs.

The set of settings stored in an xgui configuration file was greatly extended.

Extensive debugging:
batchman now computes the MSE correctly from the number of (sub-)patterns.

RBFs now receive the correct number of parameters.

Spurious segmentation faults in the graphical editor tracked and eliminated.

Segmentation fault when training on huge pattern files cleared.

Various segmentation faults on individual operating systems tracked and cleared.

netperf can now test networks that need multiple training parameters.

Segmentation faults when displaying 3D networks cleared.

Correct default values for initialization functions in batchman.

The call ``TestNet()'' used to prevent further training in batchman. Now everything works as expected.

Segmentation fault in batchman when doing multiple string concatenations cleared, and a memory leak in string operations closed. Thanks to Walter Prins, University of Stellenbosch, South Africa.

The output of the validation error in the shell window gave wrong values; this is fixed.

Algorithm SCG now respects special units and handles them correctly.

The description of the learning function parameters in Section 4.4 is now ordered alphabetically.