Archive-name: ai-faq/neural-nets/part5
Last-modified: 1998-02-24
URL: ftp://ftp.sas.com/pub/neural/FAQ5.html
Maintainer: saswss@unx.sas.com (Warren S. Sarle)
Neural Network FAQ, part 5 of 7: Free Software

The copyright for the description of each product is held by the producer or distributor of the product or by whoever supplied the description for the FAQ. By submitting a description for the FAQ, that party gives permission for it to be reproduced as part of the FAQ in any of the ways specified in part 1 of the FAQ.

This is part 5 (of 7) of a monthly posting to the Usenet newsgroup comp.ai.neural-nets. See part 1 of this posting for full information about what it covers.

========== Questions ==========

Part 1: Introduction
Part 2: Learning
Part 3: Generalization
Part 4: Books, data, etc.
Part 5: Free software
    Freeware and shareware packages for NN simulation?

Part 6: Commercial software
Part 7: Hardware and miscellaneous
------------------------------------------------------------------------

Subject: Freeware and shareware packages for NN simulation?

Since the FAQ maintainer works for a software company, he does not recommend or evaluate software in the FAQ. The descriptions below are provided by the developers or distributors of the software.

Note for future submissions: Please restrict software descriptions to a maximum of 60 lines of 72 characters, in either plain-text format or, preferably, HTML format. If you include the standard header (name, company, address, etc.), you need not count the header in the 60 line maximum. Please confine your HTML to features that are supported by most browsers, especially NCSA Mosaic 2.0; avoid tables, for example--use <pre> instead. Try to make the descriptions objective, and avoid making implicit or explicit assertions about competing products, such as "Our product is the *only* one that does so-and-so" or "Our innovative product trains bigger nets faster." The FAQ maintainer reserves the right to remove excessive marketing hype and to edit submissions to conform to size requirements; if he is in a good mood, he may also correct spelling and punctuation.

The following simulators are described below:

  1. Rochester Connectionist Simulator
  2. UCLA-SFINX
  3. NeurDS
  4. PlaNet (formerly known as SunNet)
  5. GENESIS
  6. Mactivation
  7. Cascade Correlation Simulator
  8. Quickprop
  9. DartNet
  10. SNNS
  11. Aspirin/MIGRAINES
  12. Adaptive Logic Network Educational Kit
  13. PDP++
  14. Uts (Xerion, the sequel)
  15. Neocognitron simulator
  16. Multi-Module Neural Computing Environment (MUME)
  17. LVQ_PAK, SOM_PAK
  18. Nevada Backpropagation (NevProp)
  19. Fuzzy ARTmap
  20. PYGMALION
  21. Basis-of-AI-NN Software
  22. Matrix Backpropagation
  23. WinNN
  24. BIOSIM
  25. The Brain
  26. FuNeGen
  27. NeuDL -- Neural-Network Description Language
  28. NeoC Explorer
  29. AINET
  30. DemoGNG
  31. PMNEURO 1.0a
  32. nn/xnn
  33. NNDT
  34. Trajan 2.1 Shareware
  35. Neural Networks at your Fingertips
  36. NNFit
  37. Nenet v1.0
  38. Machine Consciousness Toolbox
  39. NICO Toolkit (speech recognition)
See also http://www.emsl.pnl.gov:2080/docs/cie/neural/systems/shareware.html

  1. Rochester Connectionist Simulator

    A quite versatile simulator for arbitrary types of neural nets. Comes with a backprop package and an X11/SunView interface. Available via anonymous FTP from ftp.cs.rochester.edu in directory pub/packages/simulator as the files README (8 KB) and rcs_v4.2.tar.Z (2.9 MB).

  2. UCLA-SFINX

       ftp retina.cs.ucla.edu [131.179.16.6];
       Login name: sfinxftp;  Password: joshua;
       directory: pub;
       files : README; sfinx_v2.0.tar.Z;
       Email info request : sfinx@retina.cs.ucla.edu 
    

  3. NeurDS

    A simulator for DEC systems supporting a VT100 terminal. Available for anonymous ftp from gatekeeper.dec.com [16.1.0.2] in directory pub/DEC as the file NeurDS031.tar.Z (111 KB).

  4. PlaNet 5.7 (formerly known as SunNet)

    A popular connectionist simulator created by Yoshiro Miyata (Chukyo Univ., Japan), with versions that run under X Windows and on non-graphics terminals. A 60-page User's Guide in PostScript is included. Send any questions to miyata@sccs.chukyo-u.ac.jp. Available for anonymous ftp from ftp.ira.uka.de as /pub/neuron/PlaNet5.7.tar.Z (800 KB) or from boulder.colorado.edu [128.138.240.1] as /pub/generic-sources/PlaNet5.7.tar.Z.

  5. GENESIS

    GENESIS 2.0 (GEneral NEural SImulation System) is a general purpose simulation platform which was developed to support the simulation of neural systems ranging from complex models of single neurons to simulations of large networks made up of more abstract neuronal components. Most current GENESIS applications involve realistic simulations of biological neural systems. Although the software can also model more abstract networks, other simulators are more suitable for backpropagation and similar connectionist modeling. Runs on most Unix platforms. Graphical front end XODUS. Parallel version for networks of workstations, symmetric multiprocessors, and MPPs also available. Available by ftp from ftp://genesis.bbb.caltech.edu/pub/genesis. Further information via WWW at http://www.bbb.caltech.edu/GENESIS/.

  6. Mactivation

    A neural network simulator for the Apple Macintosh. Available for ftp from ftp.cs.colorado.edu [128.138.243.151] as /pub/cs/misc/Mactivation-3.3.sea.hqx

  7. Cascade Correlation Simulator

    A simulator for Scott Fahlman's Cascade Correlation algorithm. Available for ftp from ftp.cs.cmu.edu in directory /afs/cs/project/connect/code/supported as the file cascor-v1.2.shar (223 KB) There is also a version of recurrent cascade correlation in the same directory in file rcc1.c (108 KB).

  8. Quickprop

    A variation of the back-propagation algorithm developed by Scott Fahlman. A simulator is available in the same directory as the cascade correlation simulator above, in the file nevprop1.16.shar (137 KB).
    (There is also an obsolete simulator called quickprop1.c (21 KB) in the same directory, but it has been superseded by NevProp. See also the description of NevProp below.)
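
    For illustration, the core quickprop weight update can be sketched as follows. This is a simplified Python sketch of the published update rule, not Fahlman's actual code; the parameter names epsilon (learning rate) and mu (maximum growth factor) are this sketch's own, and the full algorithm includes additional refinements.

```python
def quickprop_step(w, dw_prev, s, s_prev, epsilon=0.5, mu=1.75):
    """One quickprop update for a single weight.

    s and s_prev are the current and previous error slopes dE/dw;
    dw_prev is the previous weight change.  Returns (new_w, dw).
    """
    if dw_prev != 0.0 and s_prev != s:
        # Fit a parabola through the two slope measurements and
        # jump toward its minimum.
        dw = s / (s_prev - s) * dw_prev
        # Limit the step to mu times the previous step
        # (the "maximum growth factor").
        if abs(dw) > mu * abs(dw_prev):
            dw = mu * abs(dw_prev) * (1.0 if dw > 0 else -1.0)
    else:
        # No usable previous step: fall back to plain gradient descent.
        dw = -epsilon * s
    return w + dw, dw
```

    On a quadratic error surface the parabola fit is exact, so the second step lands on the minimum; on real error surfaces the growth-factor cap keeps the extrapolation from running away.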

  9. DartNet

    DartNet is a Macintosh-based backpropagation simulator, developed at Dartmouth by Jamshed Bharucha and Sean Nolan as a pedagogical tool. It makes use of the Mac's graphical interface, and provides a number of tools for building, editing, training, testing and examining networks. This program is available by anonymous ftp from ftp.dartmouth.edu as /pub/mac/dartnet.sit.hqx (124 KB).

  10. SNNS 4.1

    "Stuttgart Neural Network Simulator" from the University of Tuebingen, Germany (formerly from the University of Stuttgart): a simulator for many types of nets with an X11 interface: graphical 2D and 3D topology editor/visualizer, training visualisation, multiple pattern set handling, etc.

    Currently supports backpropagation (vanilla, online, with momentum term and flat spot elimination, batch, time delay), counterpropagation, quickprop, backpercolation 1, generalized radial basis functions (RBF), RProp, ART1, ART2, ARTMAP, Cascade Correlation, Recurrent Cascade Correlation, Dynamic LVQ, Backpropagation through time (for recurrent networks), batch backpropagation through time (for recurrent networks), Quickpropagation through time (for recurrent networks), Hopfield networks, Jordan and Elman networks, autoassociative memory, self-organizing maps, time-delay networks (TDNN), RBF_DDA, simulated annealing, Monte Carlo, Pruned Cascade-Correlation, Optimal Brain Damage, Optimal Brain Surgeon, Skeletonization, and is user-extendable (user-defined activation functions, output functions, site functions, learning procedures). C code generator snns2c.

    Works on SunOS, Solaris, IRIX, Ultrix, OSF, AIX, HP/UX, NextStep, Linux, and Windows 95/NT. Distributed kernel can spread one learning run over a workstation cluster.

    SNNS web page: http://www-ra.informatik.uni-tuebingen.de/SNNS
    Ftp server: ftp://ftp.informatik.uni-tuebingen.de/pub/SNNS

    Mailing list: http://www-ra.informatik.uni-tuebingen.de/SNNS/about-ml.html

  11. Aspirin/MIGRAINES

    Aspirin/MIGRAINES 6.0 consists of a code generator that builds neural network simulations by reading a network description (written in a language called "Aspirin") and generating a C simulation. An interface (called "MIGRAINES") is provided to export data from the neural network to visualization tools. The system has been ported to a large number of platforms. The goal of Aspirin is to provide a common extendible front-end language and parser for different network paradigms. The MIGRAINES interface is a terminal-based interface that allows you to open Unix pipes to data in the neural network. Users can display the data using either public or commercial graphics/analysis tools. Example filters are included that convert data exported through MIGRAINES to formats readable by Gnuplot 3.0, Matlab, Mathematica, and xgobi.

    The software is available from two FTP sites: from CMU's simulator collection on pt.cs.cmu.edu [128.2.254.155] in /afs/cs/project/connect/code/unsupported/am6.tar.Z and from UCLA's cognitive science machine ftp.cognet.ucla.edu [128.97.50.19] in /pub/alexis/am6.tar.Z (2 MB).

  12. Adaptive Logic Network Educational Kit (for Windows)

    The Atree 3.0 Educational Kit (EK) serves to develop simple applications using Adaptive Logic Networks (ALNs). In an ALN, the logic functions AND and OR make up all hidden layers except the first, which uses simple perceptrons. Although such a net cannot compute real-valued outputs, since its outputs are strictly boolean, it can easily and naturally represent real-valued functions by outputting 0 above the function's graph and 1 otherwise. This approach is extremely useful, since it allows the user to impose constraints on the functions to be learned (monotonicity, bounds on slopes, convexity, ...). Very rapid computation of functions is done by an ALN decision tree at whose leaves are small expressions of minimum and maximum operations acting on linear functions.
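
    To see how such a tree evaluates, consider the following Python sketch (purely illustrative; not the Atree API): a maximum of minima of linear functions yields a piecewise-linear function, and the boolean "above/below the graph" test encodes it as described above.

```python
def eval_minmax(x, leaves):
    """Piecewise-linear function: max over leaves of the min over
    the (a, b) pairs in each leaf, where each pair is the line a*x + b."""
    return max(min(a * x + b for (a, b) in leaf) for leaf in leaves)

def aln_output(x, y, leaves):
    """Boolean encoding of the function: 0 above its graph, 1 otherwise."""
    return 0 if y > eval_minmax(x, leaves) else 1
```

    For example, two leaves holding the lines x and -x represent the function |x|.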

    Two simple languages describe ALNs and the steps of training an ALN. Execution software for ALN decision trees resulting from training is provided in C source form for experimenters. The EK and a 120-page User's Guide are obtained by anonymous ftp from ftp.cs.ualberta.ca in directory /pub/atree/atree3/. Get the file atree3ek.exe (~900K).

    The above User's Guide with an introduction to basic ALN theory is available on WWW at http://www.cs.ualberta.ca/~arms/guide/ch0.htm . This Educational Kit software is the same as the commercial Atree 3.0 program except that it allows only two input variables and is licensed for educational uses only. A built-in 2D and 3D plotting capability is useful to help the user understand how ALNs work.

  13. PDP++

    The PDP++ software is a new neural-network simulation system written in C++. It represents the next generation of the PDP software released with the McClelland and Rumelhart "Explorations in Parallel Distributed Processing Handbook", MIT Press, 1987. It is easy enough for novice users, but very powerful and flexible for research use.
    The current version is 1.0, our first non-beta release. It has been extensively tested and should be completely usable. Works on Unix with X-Windows.

    Features: Full GUI (InterViews), realtime network viewer, data viewer, extendable object-oriented design, CSS scripting language with source-level debugger, GUI macro recording.

    Algorithms: Feedforward and several recurrent BP, Boltzmann machine, Hopfield, Mean-field, Interactive activation and competition, continuous stochastic networks.

    The software can be obtained by anonymous ftp from ftp://cnbc.cmu.edu/pub/pdp++/ and from ftp://unix.hensa.ac.uk/mirrors/pdp++/.

    For more information, see our WWW page at http://www.cnbc.cmu.edu/PDP++/PDP++.html.
    There is a 250 page (printed) manual and an HTML version available on-line at the above address.

  14. Uts (Xerion, the sequel)

    Uts is a portable artificial neural network simulator written on top of the Tool Control Language (Tcl) and the Tk UI toolkit. As a result, the user interface is readily modifiable, and the graphical user interface, the visualization tools, and scripts written in Tcl can be used simultaneously. Uts itself implements only the connectionist paradigm of linked units in Tcl and the basic elements of the graphical user interface. To make a ready-to-use package, there are modules that use Uts to do back-propagation (tkbp) and Gaussian mixture optimization by EM (tkmxm). Uts is available from ftp.cs.toronto.edu in directory /pub/xerion.

  15. Neocognitron simulator

    The simulator is written in C and comes with a list of references that must be read to understand the specifics of the implementation. The unsupervised version is coded without (!) C-cell inhibition. Available for anonymous ftp from unix.hensa.ac.uk [129.12.21.7] as /pub/neocognitron.tar.Z (130 KB).

  16. Multi-Module Neural Computing Environment (MUME)

    MUME is a simulation environment for multi-module neural computing. It provides an object-oriented facility for the simulation and training of multiple nets with various architectures and learning algorithms. MUME includes a library of network architectures including feedforward, simple recurrent, and continuously running recurrent neural networks. Each architecture is supported by a variety of learning algorithms. MUME can be used for large-scale neural network simulations, as it provides support for learning in multi-net environments. It also provides pre- and post-processing facilities.

    The modules are provided in a library. Several "front-ends" or clients are also available. X-Window support by editor/visualization tool Xmume. MUME can be used to include non-neural computing modules (decision trees, ...) in applications. MUME is available for educational institutions by anonymous ftp on mickey.sedal.su.oz.au [129.78.24.170] after signing and sending a licence: /pub/license.ps (67 kb).

    Contact:
    Marwan Jabri, SEDAL, Sydney University Electrical Engineering,
    NSW 2006 Australia, marwan@sedal.su.oz.au

  17. LVQ_PAK, SOM_PAK

    These are packages for Learning Vector Quantization and Self-Organizing Maps, respectively. They have been built by the LVQ/SOM Programming Team of the Helsinki University of Technology, Laboratory of Computer and Information Science, Rakentajanaukio 2 C, SF-02150 Espoo, FINLAND. There are versions for Unix and MS-DOS, available from http://nucleus.hut.fi/nnrc/nnrc-programs.html

  18. Nevada Backpropagation (NevProp)

      NevProp is a free, easy-to-use feedforward backpropagation
      (multilayer perceptron) program.  It uses an interactive
      character-based interface, and is distributed as C source code that
      should compile and run on most platforms. (Precompiled executables are
      available for Macintosh and DOS.)  The original version was Quickprop
      1.0 by Scott Fahlman, as translated from Common Lisp by Terry Regier.
      We added early-stopped training based on a held-out subset of data, c
      index (ROC curve area) calculation, the ability to force gradient
      descent (per-epoch or per-pattern), and additional options.
      FEATURES (NevProp version 1.16):
       UNLIMITED (except by machine memory) number of input PATTERNS;
       UNLIMITED number of input, hidden, and output UNITS;
       Arbitrary CONNECTIONS among the various layers' units;
       Clock-time or user-specified RANDOM SEED for initial random weights;
       Choice of regular GRADIENT DESCENT or QUICKPROP;
       Choice of PER-EPOCH or PER-PATTERN (stochastic) weight updating;
       GENERALIZATION to a test dataset;
       AUTOMATICALLY STOPPED TRAINING based on generalization;
       RETENTION of best-generalizing weights and predictions;
       Simple but useful GRAPHIC display to show smoothness of generalization;
       SAVING of results to a file while working interactively;
       SAVING of weights file and reloading for continued training;
       PREDICTION-only on datasets by applying an existing weights file;
       In addition to RMS error, the concordance, or c index is displayed.
       The c index (area under the ROC curve) shows the correctness of the
       RELATIVE ordering of predictions AMONG the cases; ie, it is a
       measure of discriminative power of the model.
       AVAILABILITY: See:
       ftp://ftp.scs.unr.edu/pub/cbmr/nevpropdir/index.html
       Version 2 has some new features:
       more flexible file formatting (including access to external data files;
       option to prerandomize data order; randomized stochastic gradient descent;
       option to rescale predictor (input) variables); linear output units as an
       alternative to sigmoidal units for use with continuous-valued dependent
       variables (output targets); cross-entropy (maximum likelihood) criterion
       function as an alternative to square error for use with categorical
       dependent variables (classification/symbolic/nominal targets); and
       interactive interrupt to change settings on-the-fly.
       Version 4 will be released in 1998.
       Limited support is available from Phil Goodman (goodman@unr.edu),
       University of Nevada Center for Biomedical Research.
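
       The c index mentioned above can be sketched as follows (an illustrative Python version, not NevProp's code): among all pairs of cases with different outcomes, count the fraction in which the case with the higher outcome also received the higher prediction, with tied predictions counting one half.

```python
def c_index(predictions, targets):
    """Concordance (area under the ROC curve) over all usable case pairs."""
    pairs = 0
    concordant = 0.0
    n = len(predictions)
    for i in range(n):
        for j in range(i + 1, n):
            if targets[i] == targets[j]:
                continue  # pairs with equal outcomes carry no ordering information
            pairs += 1
            hi, lo = (i, j) if targets[i] > targets[j] else (j, i)
            if predictions[hi] > predictions[lo]:
                concordant += 1.0   # correctly ordered pair
            elif predictions[hi] == predictions[lo]:
                concordant += 0.5   # tied predictions count one half
    return concordant / pairs
```

       A c index of 1.0 means every such pair is ordered correctly, 0.5 is chance-level discrimination, and 0.0 means every pair is reversed.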
    

  19. Fuzzy ARTmap

    This is just a small example program. Available for anonymous ftp from park.bu.edu [128.176.121.56] as ftp://cns-ftp.bu.edu/pub/fuzzy-artmap.tar.Z (44 kB).

  20. PYGMALION

    This is a prototype that stems from an ESPRIT project. It implements back-propagation, self-organising maps, and Hopfield nets. Available for ftp from ftp.funet.fi [128.214.248.6] as /pub/sci/neural/sims/pygmalion.tar.Z (1534 kb). (The original site is imag.imag.fr: archive/pygmalion/pygmalion.tar.Z.)

  21. Basis-of-AI-NN Software

    Non-GUI DOS and UNIX source code, DOS binaries, and examples are available in the following program sets; the backprop package also has a Windows 3.x binary and a Unix/Tcl/Tk version:
       [backprop, quickprop, delta-bar-delta, recurrent networks],
       [simple clustering, k-nearest neighbor, LVQ1, DSM],
       [Hopfield, Boltzmann, interactive activation network],
       [feedforward counterpropagation],
       [ART I],
       [a simple BAM] and
       [the linear pattern classifier]
       
    For details see: Basis of AI NN software at http://www.mcs.com/~drt/svbp.html .

    An improved professional version of backprop is also available, $30 for regular people, $200 for businesses and governmental agencies. See: Basis of AI Professional Backprop at http://www.mcs.com/~drt/probp.html .

    Questions to: Don Tveter, drt@mcs.com

  22. Matrix Backpropagation

    MBP (Matrix Back Propagation) is a very efficient implementation of the back-propagation algorithm for current-generation workstations. The algorithm includes a per-epoch adaptive technique for gradient descent. All the computations are done through matrix multiplications and make use of highly optimized C code. The goal is to reach near-peak performance on RISCs with superscalar capabilities and fast caches. On some machines (and with large networks) a 30-40x speed-up can be measured with respect to conventional implementations. The software is available by anonymous ftp from risc6000.dibe.unige.it [130.251.89.154] as /pub/MBPv1.1.tar.Z (Unix version), /pub/MBPv11.zip.Z (MS-DOS version), and /pub/mpbv11.ps (documentation). For more information, contact Davide Anguita (anguita@dibe.unige.it).
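
    The matrix formulation that MBP exploits can be illustrated with a generic batch-backprop sketch (Python/NumPy here for brevity; MBP itself is optimized C). Every forward and backward quantity for a whole epoch is a single matrix product, which is exactly what maps well onto fast caches and superscalar hardware.

```python
import numpy as np

def epoch(X, T, W1, W2, lr=0.01):
    """One batch backprop epoch expressed entirely as matrix products.

    X holds one input pattern per row, T the target rows; the weight
    matrices W1, W2 are updated in place.  Returns the summed squared
    error measured before the update."""
    H = np.tanh(X @ W1)                 # all hidden activations at once
    Y = np.tanh(H @ W2)                 # all outputs at once
    E = Y - T
    dY = E * (1.0 - Y ** 2)             # output-layer deltas
    dH = (dY @ W2.T) * (1.0 - H ** 2)   # hidden deltas, backpropagated
    W2 -= lr * (H.T @ dY)               # both gradients are matrix products
    W1 -= lr * (X.T @ dH)
    return float((E ** 2).sum())
```

    Repeated calls perform per-epoch (batch) gradient descent; the adaptive step-size technique MBP adds on top is not shown.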

  23. WinNN

    WinNN is a shareware neural network (NN) package for Windows 3.1. WinNN combines a very user-friendly interface with a powerful computational engine. It is intended as a tool for beginners and more advanced neural network users, providing an alternative to more expensive and harder-to-use packages. WinNN can implement feed-forward multi-layered NNs and uses a modified fast back-propagation for training. Extensive on-line help. Has various neuron functions. Allows on-the-fly testing of network performance and generalization. All training parameters can be easily modified while WinNN is training. Results can be saved to disk or copied to the clipboard. Supports plotting of outputs and weight distributions. Available for ftp from ftp.cc.monash.edu.au as /pub/win3/programr/winnn97.zip (747 kB).

  24. BIOSIM

    BIOSIM is a biologically oriented neural network simulator. Public domain, runs on Unix (a less powerful PC version is available, too), easy to install, bilingual (German and English), has a GUI (graphical user interface), designed for research and teaching, provides online help facilities, offers controlling interfaces; a batch version is available, and a DEMO is provided.

    REQUIREMENTS (Unix version): X11 Rel. 3 and above, Motif Rel 1.0 and above, 12 MB of physical memory, recommended are 24 MB and more, 20 MB disc space. REQUIREMENTS (PC version): PC-compatible with MS Windows 3.0 and above, 4 MB of physical memory, recommended are 8 MB and more, 1 MB disc space.

    Four neuron models are implemented in BIOSIM: a simple model only switching ion channels on and off, the original Hodgkin-Huxley model, the SWIM model (a modified HH model) and the Golowasch-Buchholz model. Dendrites consist of a chain of segments without bifurcation. A neural network can be created by using the interactive network editor which is part of BIOSIM. Parameters can be changed via context-sensitive menus and the results of the simulation can be visualized in observation windows for neurons and synapses. Stochastic processes such as noise can be included. In addition, biologically oriented learning and forgetting processes are modeled, e.g. sensitization, habituation, conditioning, Hebbian learning and competitive learning. Three synaptic types are predefined (an excitatory synapse type, an inhibitory synapse type and an electrical synapse). Additional synaptic types can be created interactively as desired.

    Available for ftp from ftp.uni-kl.de in directory /pub/bio/neurobio: Get /pub/bio/neurobio/biosim.readme (2 kb) and /pub/bio/neurobio/biosim.tar.Z (2.6 MB) for the Unix version or /pub/bio/neurobio/biosimpc.readme (2 kb) and /pub/bio/neurobio/biosimpc.zip (150 kb) for the PC version.

    Contact:
    Stefan Bergdoll
    Department of Software Engineering (ZXA/US)
    BASF Inc.
    D-67056 Ludwigshafen; Germany
    bergdoll@zxa.basf-ag.de phone 0621-60-21372 fax 0621-60-43735

  25. The Brain

    The Brain is an advanced neural network simulator for PCs that is simple enough to be used by non-technical people, yet sophisticated enough for serious research work. It is based upon the backpropagation learning algorithm. Three sample networks are included. The documentation included provides you with an introduction and overview of the concepts and applications of neural networks as well as outlining the features and capabilities of The Brain.

    The Brain requires 512K memory and MS-DOS or PC-DOS version 3.20 or later (versions for other OS's and machines are available). A 386 (with maths coprocessor) or higher is recommended for serious use of The Brain. Shareware payment required.

    The demo version is restricted in the number of units the network can handle, due to memory constraints on PCs. The registered version allows use of extra memory.

    External documentation included: 39Kb, 20 Pages.
    Source included: No (Source comes with registration).
    Available via anonymous ftp from ftp.tu-clausthal.de as /pub/msdos/science/brain12.zip (78 kb) and from ftp.technion.ac.il as /pub/contrib/dos/brain12.zip (78 kb)

    Contact:
    David Perkovic
    DP Computing
    PO Box 712
    Noarlunga Center SA 5168
    Australia
    Email: dip@mod.dsto.gov.au (preferred) or dpc@mep.com or perkovic@cleese.apana.org.au

  26. FuNeGen 1.0

    FuNeGen is an MLP-based software program to generate fuzzy-rule-based classifiers. A limited version (maximum of 7 inputs and 3 membership functions for each input) for PCs is available for anonymous ftp from obelix.microelectronic.e-technik.th-darmstadt.de in directory /pub/neurofuzzy. For further information see the file read.me. Contact: Saman K. Halgamuge

  27. NeuDL -- Neural-Network Description Language

    NeuDL is a description language for the design, training, and operation of neural networks. It is currently limited to the backpropagation neural-network model; however, it offers a great deal of flexibility. For example, the user can explicitly specify the connections between nodes and can create or destroy connections dynamically as training progresses. NeuDL is an interpreted language resembling C or C++. It also has instructions dealing with training/testing set manipulation as well as neural network operation. A NeuDL program can be run in interpreted mode or automatically translated into C++, which can be compiled and then executed. The NeuDL interpreter is written in C++ and can be easily extended with new instructions.

    NeuDL is available from the anonymous ftp site at the University of Alabama: cs.ua.edu (130.160.44.1) in the file /pub/neudl/NeuDLver021.tar. The tarred file contains the interpreter source code (in C++), a user manual, a paper about NeuDL, and about 25 sample NeuDL programs. A document demonstrating NeuDL's capabilities is also available from the ftp site as /pub/neudl/demo.doc. For more information contact the author: Joey Rogers (jrogers@buster.eng.ua.edu).

  28. NeoC Explorer (Pattern Maker included)

    The NeoC software is an implementation of Fukushima's Neocognitron neural network. Its purpose is to test the model and to facilitate interactive experiments. Some substantial features: GUI, explorer and tester operation modes, recognition statistics, performance analysis, element display, easy net construction. PLUS, a pattern-maker utility for testing ANNs: GUI, text file output, transformations. Available for anonymous FTP from OAK.Oakland.Edu (141.210.10.117) as /SimTel/msdos/neurlnet/neocog10.zip (193 kB, DOS version).

  29. AINET

    AINET is a probabilistic neural network application which runs on Windows 95/NT. It was designed specifically to facilitate the modeling task in neural network problems. It is very fast and can be used in conjunction with many different programming languages. It does not require iterative learning, has no limit on the number of variables (input and output neurons), and no limit on sample size. It is not sensitive to noise in the data. The database can be changed dynamically. It provides a way to estimate the rate of error in your predictions. It has a graphical spreadsheet-like user interface. The AINET manual (more than 100 pages) is divided into "User's Guide", "Basics About Modeling with the AINET", "Examples", "The AINET DLL library" and an "Appendix" where the theoretical background is presented. You can get a full working copy from: http://www.ainet-sp.si/

  30. DemoGNG

    This simulator is written in Java and should therefore run without compilation on all platforms where a Java interpreter (or a browser with Java support) is available. It implements several competitive learning algorithms and neural network models. DemoGNG is distributed under the GNU General Public License. It allows experimenting with the different methods using various probability distributions. All model parameters can be set interactively on the graphical user interface. A teach mode is provided to observe the models in "slow motion" if so desired. It is currently not possible to experiment with user-provided data, so the simulator is useful mainly for demonstration and teaching purposes and as a sample implementation of the algorithms.

    DemoGNG can be accessed most easily at http://www.neuroinformatik.ruhr-uni-bochum.de/ in the file /ini/VDM/research/gsn/DemoGNG/GNG.html where it is embedded as Java applet into a Web page and is downloaded for immediate execution when you visit this page. An accompanying paper entitled "Some competitive learning methods" describes the implemented models in detail and is available in html at the same server in the directory ini/VDM/research/gsn/JavaPaper/.

    It is also possible to download the complete source code and a PostScript version of the paper via anonymous ftp from ftp.neuroinformatik.ruhr-uni-bochum.de [134.147.176.16] in directory /pub/software/NN/DemoGNG/. The software is in the file DemoGNG-1.00.tar.gz (193 KB) and the paper in the file sclm.ps.gz (89 KB). There is also a README file (9 KB). Please send any comments and questions to demogng@neuroinformatik.ruhr-uni-bochum.de, which will reach Hartmut Loos, who wrote DemoGNG, and Bernd Fritzke, the author of the accompanying paper.

  31. PMNEURO 1.0a

    PMNEURO 1.0a is available at:
    
    ftp://ftp.uni-stuttgart.de/pub/systems/os2/misc/pmneuro.zip
    
    PMNEURO 1.0a creates neural networks (backpropagation); propagation
    results can be used as new training input for creating new networks and
    for further propagation trials.
    

  32. nn/xnn

       Name: nn/xnn
    Company: Neureka ANS
    Address: Klaus Hansens vei 31B
             5037 Solheimsviken
             NORWAY
      Phone: +47 55 20 15 48
      Email: neureka@bgif.no
        URL: http://www.bgif.no/neureka/ 
    Operating systems: 
         nn: UNIX or MS-DOS, 
        xnn: UNIX/X-windows, UNIX flavours: OSF1, Solaris, AIX, IRIX, Linux (1.2.13)
    System requirements: Min. 20 Mb HD + 4 Mb RAM available. If only the
                         nn/netpack part is used (i.e. not the GUI), much
                         less is needed.
    Approx. price: Free for 30 days after installation, fully functional
                   After 30 days: USD 250,-
                   35% educational discount.
    
    A comprehensive shareware system for developing and simulating artificial neural networks. You can download the software from the URL given above.

    nn is a high-level neural network specification language. The current version is best suited for feed-forward nets, but recurrent models can be and have been implemented as well. The nn compiler can generate C code or executable programs with a powerful command-line interface, but everything may also be controlled via the graphical interface (xnn). It is possible for the user to write C routines that can be called from inside the nn specification, and to use the nn specification as a function called from a C program. These features make nn well suited for application development. Please note that no programming is necessary in order to use the network models that come with the system (netpack).

    xnn is a graphical front end to networks generated by the nn compiler, and to the compiler itself. The xnn graphical interface is intuitive and easy to use for beginners, yet powerful, with many possibilities for visualizing network data. Data may be visualized during training, testing or 'off-line'.

    netpack: A number of networks have already been implemented in nn and can be used directly: MAdaline, ART1, Backpropagation, Counterpropagation, Elman, GRNN, Hopfield, Jordan, LVQ, Perceptron, RBFNN, SOFM (Kohonen). Several others are currently being developed.

    The pattern files used by the networks have a simple and flexible format and can easily be generated from other kinds of data. The data file generated by the network can be saved in ASCII or binary format. Functions for converting and pre-processing data are available.

  33. NNDT

    
                              NNDT
    
                  Neural Network Development Tool
                      Evaluation version 1.4
                           Björn Saxén
                              1995
    
    http://www.abo.fi/~bjsaxen/nndt.html ftp://ftp.abo.fi/pub/vt/bjs/

    The NNDT software is a tool for neural network training. The user interface is developed with MS Visual Basic 3.0 Professional Edition. DLL routines (written in C) are used for most of the mathematics. The program can be run on a personal computer with MS Windows, version 3.1.

    Evaluation version

    This evaluation version of NNDT may be used free of charge for personal and educational use. The software certainly contains limitations and bugs, but it is a working version that has been under development for over a year. Comments, bug reports and suggestions for improvements can be sent to:
            bjorn.saxen@abo.fi
    
    or
            Bjorn Saxen
            Heat Engineering Laboratory
            Abo Akademi University
            Biskopsgatan 8
            SF-20500 Abo
            Finland
    
    Remember, this program comes free but with no guarantee!

    A user's guide for NNDT is delivered in PostScript format. The document is split into three parts and compressed into a file called MANUAL.ZIP. Because many bitmap figures are included, the total size of the uncompressed files is about 1.5 MB.

    Features and methods

    The network algorithms implemented are of the supervised type. So far, algorithms for multi-layer perceptron (MLP) networks of the feed-forward and recurrent types are included. The MLP networks are trained with the Levenberg-Marquardt method.
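    To illustrate the Levenberg-Marquardt method mentioned above (this is a generic sketch in Python, not NNDT's actual implementation; all function names are invented for this example), here is a minimal damped least-squares loop fitting a one-node tanh model:

```python
import numpy as np

def numerical_jacobian(residual, w, eps=1e-6):
    """Finite-difference Jacobian of the residual vector w.r.t. the weights."""
    r0 = residual(w)
    J = np.zeros((r0.size, w.size))
    for i in range(w.size):
        dw = np.zeros_like(w)
        dw[i] = eps
        J[:, i] = (residual(w + dw) - r0) / eps
    return J

def levenberg_marquardt(residual, w, n_iter=100, lam=1e-3):
    """Minimize the sum of squared residuals with the LM damping heuristic."""
    for _ in range(n_iter):
        r = residual(w)
        J = numerical_jacobian(residual, w)
        # Solve the damped normal equations (J'J + lam*I) step = -J'r
        A = J.T @ J + lam * np.eye(w.size)
        step = np.linalg.solve(A, -J.T @ r)
        if np.sum(residual(w + step) ** 2) < np.sum(r ** 2):
            w = w + step
            lam *= 0.5    # success: move toward Gauss-Newton
        else:
            lam *= 2.0    # failure: move toward gradient descent
    return w

# Tiny example: fit y = tanh(a*x + b) (a one-node "network") to data.
x = np.linspace(-2, 2, 50)
y = np.tanh(1.5 * x - 0.5)
res = lambda w: np.tanh(w[0] * x + w[1]) - y
w = levenberg_marquardt(res, np.array([0.1, 0.0]))
```

    Full LM implementations for MLPs form the Jacobian of every network output with respect to every weight by backpropagation rather than finite differences, but the damping idea is the same.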

    The training requires a set of input signals and corresponding output signals, stored in a file referred to as the pattern file. This is the only file the user must provide. Optionally, parameters defining the pattern file columns, network size and network configuration may be stored in a file referred to as the setup file.

    NNDT includes a routine for graphical presentation of output signals, node activations, residuals and weights during run. The interface also provides facilities for examination of node activations and weights as well as modification of weights.

    A Windows help file is included; help is available at any time during NNDT execution by pressing F1.

    Installation

    Unzip NNDTxx.ZIP to a separate disk or to a temporary directory, e.g. c:\tmp. The program is then installed by running SETUP.EXE. See INSTALL.TXT for more details.

  34. Trajan 2.1 Shareware

    Trajan 2.1 Shareware is a Windows-based Neural Network simulation package. It includes support for the two most popular forms of Neural Network: Multilayer Perceptrons with Back Propagation and Kohonen networks.

    Trajan 2.1 Shareware concentrates on ease-of-use and feedback. It includes Graphs, Bar Charts and Data Sheets presenting a range of Statistical feedback in a simple, intuitive form. It also features extensive on-line Help.

    The Registered version of the package can support very large networks (up to 128 layers with up to 8,192 units each, subject to memory limitations in the machine), and allows simple Cut and Paste transfer of data to/from other Windows-packages, such as spreadsheet programs. The Unregistered version features limited network size and no Clipboard Cut-and-Paste.

    There is also a Professional version of Trajan 2.1, which supports a wider range of network models, training algorithms and other features.

    See Trajan Software's Home Page at http://www.trajan-software.demon.co.uk for further details, and a free copy of the Shareware version.

    Alternatively, email andrew@trajan-software.demon.co.uk for more details.

  35. Neural Networks at your Fingertips

    "Neural Networks at your Fingertips" is a package of ready-to-reuse neural network simulation source code which was prepared for educational purposes by Karsten Kutza. The package consists of eight programs, each of which implements a particular network architecture together with an embedded example application from a typical application domain.
    The applications demonstrate use of the networks in various domains such as pattern recognition, time-series forecasting, associative memory, optimization, vision, and control, and include, for example, a sunspot predictor, the traveling salesman problem, and a pole balancer.
    The programs are coded in portable, self-contained ANSI C and can be obtained from the web pages at http://www.geocities.com/CapeCanaveral/1624.

  36. NNFit

    NNFit (Neural Network data Fitting) is user-friendly software for developing empirical correlations between input and output data. Multilayered neural models have been implemented using a quasi-Newton method as the learning algorithm. An early stopping method is available, and various tables and figures are provided to evaluate the fitting performance of the neural models. The software is available for most Unix platforms with X-Windows (IBM-AIX, HP-UX, SUN, SGI, DEC, Linux). Information, a manual, and executable code (English and French versions) are available at http://www.gch.ulaval.ca/~nnfit
    Contact: Bernard P.A. Grandjean, department of chemical engineering,
    Laval University; Sainte-Foy (Québec) Canada G1K 7P4;
    grandjean@gch.ulaval.ca
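    The early stopping method mentioned above can be sketched generically. The following Python loop is illustrative only (not NNFit's actual code; the `train_step` and `val_loss` callables are invented for this example): training stops once the validation loss has not improved for a fixed number of epochs, and the best state seen is kept.

```python
def train_with_early_stopping(train_step, val_loss, max_epochs=200, patience=20):
    """Generic early-stopping loop: keep the state with the best validation
    loss, and stop when it has not improved for `patience` epochs."""
    best_loss, best_state, wait = float("inf"), None, 0
    for _ in range(max_epochs):
        state = train_step()      # one epoch of training; returns model state
        loss = val_loss(state)    # loss on a held-out validation set
        if loss < best_loss:
            best_loss, best_state, wait = loss, state, 0
        else:
            wait += 1
            if wait >= patience:  # validation loss has stalled: stop
                break
    return best_state, best_loss

# Toy demonstration: a synthetic validation curve that falls, then rises.
curve = [(e - 30) ** 2 / 100 + 5 for e in range(200)]
epoch = [-1]
def step():
    epoch[0] += 1
    return epoch[0]
state, loss = train_with_early_stopping(step, lambda e: curve[e])
# Stops around epoch 50, returning the best state (epoch 30).
```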

  37. Nenet v1.0

    Nenet v1.0 is a 32-bit Windows 95 and Windows NT 4.0 application designed to facilitate the use of a Self-Organizing Map (SOM) algorithm.

    The major motivation for Nenet was to create a user-friendly SOM tool with good visualization capabilities and a GUI allowing efficient control of the SOM parameters. The use scenarios stem from the user's point of view, and considerable work has gone into ease of use and versatile visualization methods.

    With Nenet, all the basic steps in map control can be performed. In addition, Nenet includes some more exotic and involved features, especially in the area of visualization.
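    For readers unfamiliar with the SOM algorithm that Nenet implements, the core update can be sketched in a few lines of Python. This is an illustrative sketch only, not Nenet's code; all names and parameter choices are invented for this example:

```python
import numpy as np

def train_som(data, grid_w=10, grid_h=10, n_iter=2000,
              lr0=0.5, sigma0=3.0, seed=0):
    """Minimal Self-Organizing Map: for each sample, find the best-matching
    unit (BMU) and pull it and its grid neighbours toward the sample."""
    rng = np.random.default_rng(seed)
    dim = data.shape[1]
    weights = rng.random((grid_h, grid_w, dim))
    # Grid coordinates of every unit, used by the neighbourhood function.
    ys, xs = np.mgrid[0:grid_h, 0:grid_w]
    for t in range(n_iter):
        frac = t / n_iter
        lr = lr0 * (1.0 - frac)               # learning rate decays linearly
        sigma = sigma0 * (1.0 - frac) + 0.5   # neighbourhood radius shrinks
        x = data[rng.integers(len(data))]
        # Best-matching unit: smallest Euclidean distance to the sample.
        d = np.linalg.norm(weights - x, axis=2)
        by, bx = np.unravel_index(np.argmin(d), d.shape)
        # Gaussian neighbourhood around the BMU on the map grid.
        h = np.exp(-((ys - by) ** 2 + (xs - bx) ** 2) / (2 * sigma ** 2))
        weights += lr * h[:, :, None] * (x - weights)
    return weights

# Example: map 500 random 2-D points onto a 10x10 grid.
data = np.random.default_rng(1).random((500, 2))
som = train_som(data)
qerr = np.mean([np.min(np.linalg.norm(som - p, axis=2)) for p in data])
```

    After training, nearby units on the grid hold similar weight vectors, which is what the visualization methods in SOM tools display.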

    Features in Nenet version 1.0:

    The Nenet web site is at http://www.hut.fi/~jpronkko/nenet.html. The site contains further information on Nenet and the downloadable Nenet files (three disks totalling about 3 MB).

    If you have any questions whatsoever, please contact: Nenet-Team@hut.fi or phassine@cc.hut.fi

  38. Machine Consciousness Toolbox

    See listing for Machine Consciousness Toolbox in part 6 of the FAQ.

  39. NICO Toolkit (speech recognition)

          Name: NICO Artificial Neural Network Toolkit
        Author: Nikko Strom
       Address: Speech, Music and Hearing, KTH, S-100 44, Stockholm, Sweden
         Email: nikko@speech.kth.se
           URL: http://www.speech.kth.se/NICO/index.html
     Platforms: UNIX, ANSI C; Source code tested on: HPUX, SUN Solaris, Linux
         Price: Free
    
    The NICO Toolkit is an artificial neural network toolkit designed and optimized for automatic speech recognition applications. Networks with both recurrent connections and time-delay windows are easily constructed. The network topology is very flexible -- any number of layers is allowed and layers can be arbitrarily connected. Sparse connectivity between layers can be specified. Tools for extracting input-features from the speech signal are included as well as tools for computing target values from several standard phonetic label-file formats.
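    The time-delay windows mentioned above can be illustrated with a small helper that stacks each feature frame with its past and future context. This is a generic sketch, not NICO's actual implementation; the function name and padding strategy are invented for this example:

```python
import numpy as np

def time_delay_windows(frames, left=2, right=2):
    """Stack each frame with `left` past and `right` future frames, forming
    the time-delay input windows used by NN-based speech recognizers.
    Edges are padded by repeating the first/last frame."""
    T, d = frames.shape
    padded = np.vstack([np.repeat(frames[:1], left, axis=0),
                        frames,
                        np.repeat(frames[-1:], right, axis=0)])
    # Row t of the result is [frame t-left, ..., frame t, ..., frame t+right].
    return np.hstack([padded[i:i + T] for i in range(left + right + 1)])

# Example: 6 frames of 2-D features, one frame of context on each side.
frames = np.arange(12, dtype=float).reshape(6, 2)
windows = time_delay_windows(frames, left=1, right=1)
# windows.shape is (6, 6): each row holds 3 consecutive 2-D frames.
```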

    Algorithms:

------------------------------------------------------------------------

For some of these simulators there are user mailing lists. Get the packages and look into their documentation for further info.

If you are using a small computer (PC, Mac, etc.) you may want to have a look at the Central Neural System Electronic Bulletin Board (see question "Other sources of information"). Modem: 409-737-5222; Sysop: Wesley R. Elsberry; 4160 Pirates' Beach, Galveston, TX, USA; welsberr@orca.tamu.edu. There are lots of small simulator packages in the CNS ANNSIM file set. There is an ftp mirror site for the CNS ANNSIM file set at me.uta.edu [129.107.2.20] in the /pub/neural directory. Most ANN offerings are in /pub/neural/annsim.

------------------------------------------------------------------------
Next part is part 6 (of 7). Previous part is part 4.