Simulation of Neural Networks

Most of the department's research on artificial neural networks uses the Stuttgart Neural Network Simulator (SNNS), which was originally developed by the chair's team at the University of Stuttgart.

SNNS is an efficient, universal neural network simulator for Unix workstations and PCs running Unix. It consists of a simulator kernel written in ANSI C for efficiency and portability, a graphical user interface under X11R6, a network compiler that generates C programs from trained networks, a batch version of the simulator, and various other analysis tools.
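
A small, self-contained C sketch may help illustrate what such compiler-generated code conceptually does at run time: it evaluates a fixed feed-forward network with hard-coded weights. The network layout, weights and function names below are made up for illustration and are not the actual output of the SNNS network compiler.

    /* Illustrative sketch only: evaluates a tiny 2-2-1 feed-forward network
     * with hard-coded, made-up weights, roughly what code generated from a
     * trained network does at run time.  Not actual SNNS compiler output. */
    #include <stdio.h>
    #include <math.h>

    static float logistic(float x) { return 1.0f / (1.0f + expf(-x)); }

    static void forward(const float in[2], float out[1])
    {
        /* made-up weights: two hidden units, then one output unit */
        static const float wh[2][2] = { { 4.0f, 4.0f }, { -4.0f, -4.0f } };
        static const float bh[2]    = { -2.0f, 6.0f };
        static const float wo[2]    = { 5.0f, 5.0f };
        static const float bo       = -7.0f;
        float h[2];

        for (int j = 0; j < 2; j++)
            h[j] = logistic(wh[j][0] * in[0] + wh[j][1] * in[1] + bh[j]);
        out[0] = logistic(wo[0] * h[0] + wo[1] * h[1] + bo);
    }

    int main(void)
    {
        const float patterns[4][2] = { {0,0}, {0,1}, {1,0}, {1,1} };

        for (int p = 0; p < 4; p++) {
            float out[1];
            forward(patterns[p], out);
            printf("%.0f %.0f -> %.3f\n", patterns[p][0], patterns[p][1], out[0]);
        }
        return 0;
    }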



[Figure: SNNS user interface]

Currently the following learning algorithms are implemented in SNNS:

  1. Standard Backpropagation (BP) (see the weight-update sketch after this list)
  2. Enhanced Backpropagation with momentum term and flat spot elimination
  3. Batch Backpropagation
  4. Backpropagation with weight-decay
  5. Quickprop
  6. Resilient Propagation (Rprop)
  7. Backpercolation
  8. Counterpropagation (without neighborhood relation in the Kohonen-layer)
  9. Dynamic learning vector quantization (DLVQ)
  10. Backpropagation Through Time (BPTT)
  11. Batch Backpropagation Through Time (BBPTT)
  12. Quickprop Through Time (QPTT)
  13. Cascade Correlation (CC) with embedded Backpropagation, Quickprop or Rprop
  14. Recurrent Cascade Correlation (RCC)
  15. Time-Delay-Networks (TDNN)
  16. Radial Basis Functions (RBF)
  17. Radial Basis Functions with Dynamic Decay Adjustment (RBF-DDA)
  18. Adaptive Resonance Theory 1 (ART1)
  19. Adaptive Resonance Theory 2 (ART2)
  20. ARTMAP Network
  21. Self-organizing maps (Kohonen-maps, SOM)
  22. Auto-associative networks with Hebbian learning or Delta-rule
  23. Jordan-networks
  24. Elman-networks and expanded hierarchic Elman-networks
  25. Monte-Carlo training
  26. Simulated Annealing
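
For the first two entries, the core weight update can be sketched in a few lines of C. The following self-contained example trains a 2-2-1 network on the XOR problem with standard online backpropagation plus a momentum term; it is a generic textbook formulation with arbitrarily chosen learning rate and momentum, not code taken from the SNNS kernel, and it omits flat spot elimination.

    /* Illustrative sketch of the standard backpropagation weight update with
     * a momentum term, trained on XOR with a 2-2-1 network.  Generic textbook
     * formulation, not SNNS kernel code. */
    #include <stdio.h>
    #include <math.h>

    static double sigmoid(double x) { return 1.0 / (1.0 + exp(-x)); }

    int main(void)
    {
        const double in[4][2]  = { {0,0}, {0,1}, {1,0}, {1,1} };
        const double target[4] = {   0,     1,     1,     0   };
        const double eta = 0.5, mu = 0.9;        /* learning rate, momentum */

        /* weights: 2 hidden units (2 inputs + bias), 1 output (2 hidden + bias) */
        double wh[2][3]  = { {0.3, -0.4, 0.1}, {-0.2, 0.5, -0.1} };
        double wo[3]     = { 0.2, -0.3, 0.1 };
        double dwh[2][3] = {{0}}, dwo[3] = {0};  /* previous weight changes */

        for (int epoch = 0; epoch < 10000; epoch++) {
            double sse = 0.0;
            for (int p = 0; p < 4; p++) {
                /* forward pass */
                double h[2];
                for (int j = 0; j < 2; j++)
                    h[j] = sigmoid(wh[j][0]*in[p][0] + wh[j][1]*in[p][1] + wh[j][2]);
                double o = sigmoid(wo[0]*h[0] + wo[1]*h[1] + wo[2]);

                /* backward pass: output and hidden error signals */
                double err = target[p] - o;
                double delta_o = err * o * (1.0 - o);
                double delta_h[2];
                for (int j = 0; j < 2; j++)
                    delta_h[j] = delta_o * wo[j] * h[j] * (1.0 - h[j]);
                sse += err * err;

                /* weight update: dw = eta * delta * input + mu * previous dw */
                for (int j = 0; j < 2; j++) {
                    double dw = eta * delta_o * h[j] + mu * dwo[j];
                    wo[j] += dw; dwo[j] = dw;
                }
                double dwb = eta * delta_o + mu * dwo[2];
                wo[2] += dwb; dwo[2] = dwb;
                for (int j = 0; j < 2; j++) {
                    for (int i = 0; i < 2; i++) {
                        double dw = eta * delta_h[j] * in[p][i] + mu * dwh[j][i];
                        wh[j][i] += dw; dwh[j][i] = dw;
                    }
                    double db = eta * delta_h[j] + mu * dwh[j][2];
                    wh[j][2] += db; dwh[j][2] = db;
                }
            }
            if (epoch % 2000 == 0)
                printf("epoch %5d  SSE = %f\n", epoch, sse);
        }
        return 0;
    }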

Several pruning algorithms can reduce the number of weights or neurons of a network in order to achieve higher generalization performance with fewer free parameters:

  1. Magnitude Based Pruning (Mag)
  2. Optimal Brain Damage (OBD)
  3. Optimal Brain Surgeon (OBS)
  4. Skeletonization (Skel)
  5. Non-contributing Units

With these procedures, both input neurons and hidden neurons can be removed simultaneously; a minimal sketch of magnitude-based pruning follows below.
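
As an illustration of the simplest of these methods, the sketch below removes every weight whose magnitude falls below a fixed threshold. The flat weight array and the threshold value are assumptions made for the example; a practical pruning run would alternate pruning steps with retraining, which is omitted here.

    /* Illustrative sketch of magnitude-based pruning: weights whose absolute
     * value falls below a threshold are removed (set to zero).  The flat
     * weight array and the threshold are assumptions for illustration. */
    #include <stdio.h>
    #include <stddef.h>
    #include <math.h>

    /* Prune in place; returns the number of weights removed. */
    static size_t prune_by_magnitude(double *w, size_t n, double threshold)
    {
        size_t removed = 0;
        for (size_t i = 0; i < n; i++) {
            if (w[i] != 0.0 && fabs(w[i]) < threshold) {
                w[i] = 0.0;          /* a zero weight acts as a removed link */
                removed++;
            }
        }
        return removed;
    }

    int main(void)
    {
        double weights[] = { 0.8, -0.02, 0.4, 0.005, -1.3, 0.07 };
        size_t n = sizeof weights / sizeof weights[0];

        size_t removed = prune_by_magnitude(weights, n, 0.1);
        printf("removed %zu of %zu weights\n", removed, n);
        for (size_t i = 0; i < n; i++)
            printf("w[%zu] = %g\n", i, weights[i]);
        return 0;
    }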

Finally, it is now possible to optimize the topology of neural networks by means of genetic algorithms, using the tool ENZO from the University of Karlsruhe, which is integrated with SNNS.
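
The underlying idea can be pictured as an evolutionary loop over candidate topologies. The fragment below is a deliberately oversimplified sketch in which an individual is just a hidden-layer size and the fitness function is a made-up stand-in for validation error plus a size penalty; it does not reflect how ENZO actually represents, trains or evaluates networks.

    /* Deliberately simplified sketch of evolutionary topology search:
     * each individual is a hidden-layer size, fitness is a made-up stand-in
     * for "validation error plus a penalty for network size" (lower is better).
     * ENZO's actual representation and evaluation are far richer. */
    #include <stdio.h>
    #include <stdlib.h>
    #include <math.h>

    #define POP 8
    #define GENERATIONS 20

    /* Made-up fitness: pretend error is minimal around 6 hidden units,
     * and larger networks pay a small complexity penalty. */
    static double fitness(int hidden)
    {
        double pretend_error = fabs(hidden - 6) * 0.1;
        return pretend_error + 0.01 * hidden;
    }

    static int cmp(const void *a, const void *b)
    {
        double fa = fitness(*(const int *)a), fb = fitness(*(const int *)b);
        return (fa > fb) - (fa < fb);
    }

    int main(void)
    {
        int pop[POP];
        srand(42);
        for (int i = 0; i < POP; i++)
            pop[i] = 1 + rand() % 20;            /* random initial topologies */

        for (int g = 0; g < GENERATIONS; g++) {
            qsort(pop, POP, sizeof pop[0], cmp); /* best individuals first */
            /* replace the worse half by mutated copies of the better half */
            for (int i = POP / 2; i < POP; i++) {
                int parent  = pop[i - POP / 2];
                int mutated = parent + (rand() % 3) - 1;   /* +/- one unit */
                pop[i] = mutated < 1 ? 1 : mutated;
            }
        }
        qsort(pop, POP, sizeof pop[0], cmp);
        printf("best topology found: %d hidden units\n", pop[0]);
        return 0;
    }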

For the latest news and software downloads, please see our SNNS page.


Contact

Prof. Dr. Andreas Zell, Tel.: (07071) 29-76455, zell@informatik.uni-tuebingen.de