SNNS is a joint effort of a number of people, computer science students, research assistants as well as faculty members at the University of Stuttgart, Institute for Parallel and Distributed High Performance Systems (IPVR), Stuttgart, Germany.
The project to develop an efficient and portable neural network simulator, which later became SNNS, has been led since 1989 by Dr. Andreas Zell, who designed both the predecessor to the SNNS simulator and the SNNS simulator itself, and who acted as advisor for more than two dozen independent research and Master's thesis projects that make up the SNNS simulator and some of its applications. Over time the SNNS source has grown to its current size of 5.6 MB in 190,000 lines of code. All the research was done under the supervision of Prof. Dr. Andreas Reuter and Prof. Dr. Paul Levi. We are grateful for their continuing support and for providing us with the necessary computer and network equipment.
The following persons were directly involved in the SNNS project. They are listed in the order in which they joined the SNNS team.
[Andreas Zell] Design of the SNNS simulator, SNNS project team leader [ZMS90], [ZMSK91b], [ZMSK91c], [ZMSK91a].
[Niels Mache] SNNS simulator kernel (really the heart of SNNS) [Mac90], parallel SNNS kernel on MasPar MP-1216.
[Tilman Sommer] original version of the graphical user interface XGUI with integrated network editor [Som89], PostScript printing.
[Ralf Hübner] SNNS simulator 3D graphical user interface [Hüb92], user interface development (version 2.0 to 3.0).
[Thomas Korb] SNNS network compiler and network description language Nessus [Kor89].
[Michael Vogt] Radial Basis Functions [Vog92]. Together with Günter Mamier implementation of Time Delay Networks. Definition of the new pattern format.
[Günter Mamier] SNNS visualization and analyzing tools [Mam92]. Implementation of the batch execution capability. Together with Michael Vogt implementation of the new pattern handling. Compilation and continuous update of the user manual. Maintenance of the ftp server. Bugfixes and installation of external contributions.
[Michael Schmalzl] SNNS network creation tool BigNet, implementation of Cascade Correlation, and printed character recognition with SNNS [Sch91a].
[Kai-Uwe Herrmann] ART models ART1, ART2, ARTMAP and modification of the BigNet tool [Her92].
[Artemis Hatzigeorgiou] Video documentation about the SNNS project, learning procedure Backpercolation 1.
[Dietmar Posselt] ANSI-C translation of SNNS.
[Sven Döring] ANSI-C translation of SNNS and source code maintenance. Implementation of distributed kernel for workstation clusters.
[Tobias Soyez] Jordan and Elman networks, implementation of the network analyzer [Soy93].
[Tobias Schreiner] Network pruning algorithms [Sch94].
[Bernward Kett] Redesign of C-code generator snns2c.
[Gianfranco Clemente] Help with the user manual.
[Henri Bauknecht] Manager of the SNNS mailing list.
[Jens Wieland] Design and implementation of batchman.
We are proud of the fact that SNNS is experiencing growing support from people outside our university. Many people have helped us by pointing out bugs or offering bug fixes, both to us and to other users. Unfortunately they are too numerous to list here, so we restrict ourselves to those who have made major contributions to the source code.
[Martin Riedmiller, University of Karlsruhe]
Implementation of RPROP in SNNS
[Martin Reczko, German Cancer Research Center (DKFZ)]
Implementation of Backpropagation Through Time (BPTT), Batch Backpropagation Through Time (BBPTT), and Quickprop Through Time (QPTT).
[Mark Seemann and Marcus Ritt, University of Tübingen]
Implementation of self-organizing maps.
[Jamie DeCoster, Purdue University]
Implementation of auto-associative memory functions.
[Jochen Biedermann, University of Göttingen]
Help with the implementation of pruning algorithms and non-contributing units.
[Christian Wehrfritz, University of Erlangen]
Original implementation of the projection tool, implementation of the statistics computation and learning algorithm Pruned Cascade Correlation.
[Randolf Werner, University of Koblenz]
Support for NeXT systems.
[Joachim Danz, University of Darmstadt]
Implementation of cross validation, simulated annealing and Monte Carlo learning algorithms.
[Michael Berthold, University of Karlsruhe]
Implementation of enhanced RBF algorithms.
[Bruno Orsier, University of Geneva]
Implementation of Scaled Conjugate Gradient learning.
[Till Brychcy, Technical University of Munich]
Supplied the code that keeps only the important parameters visible in the control panel.
[Joydeep Ghosh, University of Texas, Austin]
Implementation of WinSNNS, an MS-Windows front-end for SNNS batch execution on Unix workstations.
[Thomas Ragg, University of Karlsruhe]
Implementation of the genetic algorithm tool Knete.
If you would like to contact the SNNS team via e-mail, please write to Andreas Zell at `firstname.lastname@example.org'.
The SNNS simulator is a successor to an earlier neural network simulator called NetSim [ZKSB89], [KZ89] by A. Zell, T. Sommer, T. Korb and A. Bayer, which was itself influenced by the popular Rochester Connectionist Simulator RCS [GLML89].
In September 1991 the Stuttgart Neural Network Simulator SNNS was awarded the ``Deutscher Hochschul-Software-Preis 1991'' (German Federal Research Software Prize) by the German Federal Minister for Science and Education, Prof. Dr. Ortleb.