Archive-name: ai-faq/neural-nets/part5
Last-modified: 1998-02-24
URL: ftp://ftp.sas.com/pub/neural/FAQ5.html
Maintainer: saswss@unx.sas.com (Warren S. Sarle)
This is part 5 (of 7) of a monthly posting to the Usenet newsgroup comp.ai.neural-nets. See part 1 of this posting for full information on what it is all about.
------------------------------------------------------------------------
Note for future submissions: Please restrict software descriptions to a maximum of 60 lines of 72 characters, in either plain-text format or, preferably, HTML format. If you include the standard header (name, company, address, etc.), you need not count the header in the 60 line maximum. Please confine your HTML to features that are supported by most browsers, especially NCSA Mosaic 2.0; avoid tables, for example--use <pre> instead. Try to make the descriptions objective, and avoid making implicit or explicit assertions about competing products, such as "Our product is the *only* one that does so-and-so" or "Our innovative product trains bigger nets faster." The FAQ maintainer reserves the right to remove excessive marketing hype and to edit submissions to conform to size requirements; if he is in a good mood, he may also correct spelling and punctuation.
The following simulators are described below:
ftp retina.cs.ucla.edu [131.179.16.6]
Login name: sfinxftp; Password: joshua
directory: pub
files: README; sfinx_v2.0.tar.Z
Email info request: sfinx@retina.cs.ucla.edu
Currently supports backpropagation (vanilla, online, with momentum term and flat spot elimination, batch, time delay), counterpropagation, quickprop, backpercolation 1, generalized radial basis functions (RBF), RProp, ART1, ART2, ARTMAP, Cascade Correlation, Recurrent Cascade Correlation, Dynamic LVQ, Backpropagation through time (for recurrent networks), batch backpropagation through time (for recurrent networks), Quickpropagation through time (for recurrent networks), Hopfield networks, Jordan and Elman networks, autoassociative memory, self-organizing maps, time-delay networks (TDNN), RBF_DDA, simulated annealing, Monte Carlo, Pruned Cascade-Correlation, Optimal Brain Damage, Optimal Brain Surgeon, Skeletonization, and is user-extendable (user-defined activation functions, output functions, site functions, learning procedures). C code generator snns2c.
Works on SunOS, Solaris, IRIX, Ultrix, OSF, AIX, HP/UX, NextStep, Linux, and Windows 95/NT. Distributed kernel can spread one learning run over a workstation cluster.
SNNS web page: http://www-ra.informatik.uni-tuebingen.de/SNNS
Ftp server: ftp://ftp.informatik.uni-tuebingen.de/pub/SNNS
The software is available from two FTP sites: from CMU's simulator collection on pt.cs.cmu.edu [128.2.254.155] in /afs/cs/project/connect/code/unsupported/am6.tar.Z and from UCLA's cognitive science machine ftp.cognet.ucla.edu [128.97.50.19] in /pub/alexis/am6.tar.Z (2 MB).
The Atree 3.0 Educational Kit (EK) serves to develop simple applications using Adaptive Logic Networks (ALNs). In an ALN, logic functions AND and OR make up all hidden layers but the first, which uses simple perceptrons. Though this net can't compute real-valued outputs, since its outputs are strictly boolean, it can easily and naturally represent real-valued functions by giving a 0 above the function's graph and a 1 otherwise. This approach is extremely useful, since it allows the user to impose constraints on the functions to be learned (monotonicity, bounds on slopes, convexity, ...). Functions are computed very rapidly by an ALN decision tree whose leaves contain small expressions of minimum and maximum operations acting on linear functions; a sketch of such a leaf expression appears below.
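To make the leaf representation concrete, the expression at a leaf might look as follows in C. This is a hypothetical sketch, not code from the Atree kit; the function aln_leaf and all coefficients are made up for illustration:

  #include <stdio.h>

  /* Hypothetical sketch (not Atree code): an ALN decision-tree leaf
     evaluates a small expression of minimum and maximum operations
     acting on linear functions of the input. */

  static double lin(double a, double b, double x) { return a * x + b; }
  static double min2(double u, double v) { return u < v ? u : v; }
  static double max2(double u, double v) { return u > v ? u : v; }

  /* Piecewise-linear value computed as max(min(l1,l2), min(l3,l4));
     all coefficients are invented for this example. */
  double aln_leaf(double x)
  {
      double m1 = min2(lin( 1.0, 0.0, x), lin( 0.2, 0.8, x));
      double m2 = min2(lin(-0.5, 2.0, x), lin( 0.1, 1.0, x));
      return max2(m1, m2);
  }

  int main(void)
  {
      double x;
      for (x = 0.0; x <= 2.0; x += 0.5)
          printf("x = %4.1f   f(x) = %6.3f\n", x, aln_leaf(x));
      return 0;
  }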
Two simple languages describe ALNs and the steps of training an ALN. Execution software for the ALN decision trees resulting from training is provided in C source form for experimenters. The EK and a 120-page User's Guide can be obtained by anonymous ftp from ftp.cs.ualberta.ca in directory /pub/atree/atree3/. Get the file atree3ek.exe (~900K).
The above User's Guide with an introduction to basic ALN theory is available on WWW at http://www.cs.ualberta.ca/~arms/guide/ch0.htm . This Educational Kit software is the same as the commercial Atree 3.0 program except that it allows only two input variables and is licensed for educational uses only. A built-in 2D and 3D plotting capability is useful to help the user understand how ALNs work.
Features: Full GUI (InterViews), realtime network viewer, data viewer, extendable object-oriented design, CSS scripting language with source-level debugger, GUI macro recording.
Algorithms: Feedforward and several recurrent BP, Boltzmann machine, Hopfield, Mean-field, Interactive activation and competition, continuous stochastic networks.
The software can be obtained by anonymous ftp from ftp://cnbc.cmu.edu/pub/pdp++/ and from ftp://unix.hensa.ac.uk/mirrors/pdp++/.
For more information, see our WWW page at
http://www.cnbc.cmu.edu/PDP++/PDP++.html.
There is a 250-page (printed) manual and an HTML version available
on-line at the above address.
The modules are provided in a library. Several "front-ends" or clients are also available. X-Window support is provided by the editor/visualization tool Xmume. MUME can be used to include non-neural computing modules (decision trees, ...) in applications. MUME is available to educational institutions by anonymous ftp on mickey.sedal.su.oz.au [129.78.24.170] after signing and sending a licence: /pub/license.ps (67 kb).
Contact:
Marwan Jabri, SEDAL, Sydney University Electrical Engineering,
NSW 2006 Australia, marwan@sedal.su.oz.au
NevProp is a free, easy-to-use feedforward backpropagation (multilayer perceptron) program. It uses an interactive character-based interface, and is distributed as C source code that should compile and run on most platforms. (Precompiled executables are available for Macintosh and DOS.) The original version was Quickprop 1.0 by Scott Fahlman, as translated from Common Lisp by Terry Regier. We added early-stopped training based on a held-out subset of data, c index (ROC curve area) calculation, the ability to force gradient descent (per-epoch or per-pattern), and additional options.

FEATURES (NevProp version 1.16):
 - UNLIMITED (except by machine memory) number of input PATTERNS
 - UNLIMITED number of input, hidden, and output UNITS
 - Arbitrary CONNECTIONS among the various layers' units
 - Clock-time or user-specified RANDOM SEED for initial random weights
 - Choice of regular GRADIENT DESCENT or QUICKPROP
 - Choice of PER-EPOCH or PER-PATTERN (stochastic) weight updating
 - GENERALIZATION to a test dataset
 - AUTOMATICALLY STOPPED TRAINING based on generalization
 - RETENTION of best-generalizing weights and predictions
 - Simple but useful GRAPHIC display to show smoothness of generalization
 - SAVING of results to a file while working interactively
 - SAVING of weights file and reloading for continued training
 - PREDICTION-only on datasets by applying an existing weights file
 - In addition to RMS error, the concordance, or c index, is displayed. The c index (area under the ROC curve) shows the correctness of the RELATIVE ordering of predictions AMONG the cases; i.e., it is a measure of the discriminative power of the model.

AVAILABILITY: See ftp://ftp.scs.unr.edu/pub/cbmr/nevpropdir/index.html

Version 2 has some new features: more flexible file formatting (including access to external data files; option to prerandomize data order; randomized stochastic gradient descent; option to rescale predictor (input) variables); linear output units as an alternative to sigmoidal units for use with continuous-valued dependent variables (output targets); cross-entropy (maximum likelihood) criterion function as an alternative to squared error for use with categorical dependent variables (classification/symbolic/nominal targets); and interactive interrupt to change settings on the fly. Version 4 will be released in 1998. Limited support is available from Phil Goodman (goodman@unr.edu), University of Nevada Center for Biomedical Research.
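The c index has a simple pairwise definition: it is the fraction of all (positive, negative) case pairs in which the prediction for the positive case exceeds the prediction for the negative case, with ties counting one half. The following minimal C sketch (illustrative only, not NevProp source; the function name is made up) computes it directly from that definition:

  #include <stdio.h>

  /* Sketch, not NevProp code: c index from its pairwise definition.
     A value of 0.5 means no discrimination; 1.0 is perfect. */
  double c_index(const double *pred, const int *target, int n)
  {
      double score = 0.0;
      long pairs = 0;
      int i, j;
      for (i = 0; i < n; i++) {
          if (target[i] != 1) continue;          /* positive cases */
          for (j = 0; j < n; j++) {
              if (target[j] != 0) continue;      /* negative cases */
              pairs++;
              if (pred[i] > pred[j])       score += 1.0;
              else if (pred[i] == pred[j]) score += 0.5;
          }
      }
      return pairs ? score / pairs : 0.5;
  }

  int main(void)
  {
      double pred[]   = { 0.9, 0.8, 0.35, 0.6, 0.2, 0.1 };
      int    target[] = { 1,   1,   1,    0,   0,   0   };
      printf("c index = %.3f\n", c_index(pred, target, 6));  /* 0.889 */
      return 0;
  }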
[backprop, quickprop, delta-bar-delta, recurrent networks], [simple clustering, k-nearest neighbor, LVQ1, DSM], [Hopfield, Boltzmann, interactive activation network], [feedforward counterpropagation], [ART I], [a simple BAM], and [the linear pattern classifier]. For details see: Basis of AI NN software at http://www.mcs.com/~drt/svbp.html .
An improved professional version of backprop is also available, $30 for regular people, $200 for businesses and governmental agencies. See: Basis of AI Professional Backprop at http://www.mcs.com/~drt/probp.html .
Questions to: Don Tveter, drt@mcs.com
REQUIREMENTS (Unix version): X11 Rel. 3 and above, Motif Rel 1.0 and above, 12 MB of physical memory, recommended are 24 MB and more, 20 MB disc space. REQUIREMENTS (PC version): PC-compatible with MS Windows 3.0 and above, 4 MB of physical memory, recommended are 8 MB and more, 1 MB disc space.
Four neuron models are implemented in BIOSIM: a simple model that only switches ion channels on and off, the original Hodgkin-Huxley model, the SWIM model (a modified HH model), and the Golowasch-Buchholz model. Dendrites consist of a chain of segments without bifurcation. A neural network can be created by using the interactive network editor which is part of BIOSIM. Parameters can be changed via context-sensitive menus and the results of the simulation can be visualized in observation windows for neurons and synapses. Stochastic processes such as noise can be included. In addition, biologically oriented learning and forgetting processes are modeled, e.g. sensitization, habituation, conditioning, Hebbian learning and competitive learning. Three synaptic types are predefined (an excitatory synapse type, an inhibitory synapse type and an electrical synapse). Additional synaptic types can be created interactively as desired.
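For readers who have not seen the second of these models: the classical Hodgkin-Huxley equations are easy to integrate with forward Euler. The sketch below uses the standard squid-axon textbook parameters; it is illustrative only and is not BIOSIM code:

  #include <stdio.h>
  #include <math.h>

  /* Illustrative sketch (not BIOSIM code): forward-Euler integration
     of the Hodgkin-Huxley membrane equations, standard squid-axon
     parameters.  V in mV, t in ms, currents in uA/cm^2. */

  static double am(double V){ return 0.1*(V+40)/(1-exp(-(V+40)/10)); }
  static double bm(double V){ return 4*exp(-(V+65)/18); }
  static double ah(double V){ return 0.07*exp(-(V+65)/20); }
  static double bh(double V){ return 1/(1+exp(-(V+35)/10)); }
  static double an(double V){ return 0.01*(V+55)/(1-exp(-(V+55)/10)); }
  static double bn(double V){ return 0.125*exp(-(V+65)/80); }

  int main(void)
  {
      const double gNa=120, gK=36, gL=0.3;      /* mS/cm^2 */
      const double ENa=50, EK=-77, EL=-54.4;    /* mV */
      const double C=1.0, dt=0.01, Iext=10.0;   /* uF/cm^2, ms, uA/cm^2 */
      double V=-65.0, m=0.05, h=0.6, n=0.32;    /* resting state */
      double t;

      for (t = 0; t < 50.0; t += dt) {
          double INa = gNa*m*m*m*h*(V-ENa);     /* sodium current    */
          double IK  = gK*n*n*n*n*(V-EK);       /* potassium current */
          double IL  = gL*(V-EL);               /* leak current      */
          V += dt*(Iext - INa - IK - IL)/C;
          m += dt*(am(V)*(1-m) - bm(V)*m);      /* gating variables  */
          h += dt*(ah(V)*(1-h) - bh(V)*h);
          n += dt*(an(V)*(1-n) - bn(V)*n);
          if (fmod(t, 1.0) < dt)                /* print roughly every ms */
              printf("t = %5.1f ms   V = %8.3f mV\n", t, V);
      }
      return 0;
  }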
Available for ftp from ftp.uni-kl.de in directory /pub/bio/neurobio: Get /pub/bio/neurobio/biosim.readme (2 kb) and /pub/bio/neurobio/biosim.tar.Z (2.6 MB) for the Unix version or /pub/bio/neurobio/biosimpc.readme (2 kb) and /pub/bio/neurobio/biosimpc.zip (150 kb) for the PC version.
Contact:
Stefan Bergdoll
Department of Software Engineering (ZXA/US)
BASF Inc.
D-67056 Ludwigshafen; Germany
bergdoll@zxa.basf-ag.de phone 0621-60-21372 fax 0621-60-43735
The Brain requires 512K memory and MS-DOS or PC-DOS version 3.20 or later (versions for other OS's and machines are available). A 386 (with maths coprocessor) or higher is recommended for serious use of The Brain. Shareware payment required.
The demo version is restricted in the number of units the network can handle, due to memory constraints on PCs. The registered version allows the use of extra memory.
External documentation included: 39Kb, 20 Pages.
Source included: No (Source comes with registration).
Available via anonymous ftp from
ftp.tu-clausthal.de as
/pub/msdos/science/brain12.zip (78 kb)
and from ftp.technion.ac.il as
/pub/contrib/dos/brain12.zip (78 kb)
Contact:
David Perkovic
DP Computing
PO Box 712
Noarlunga Center SA 5168
Australia
Email: dip@mod.dsto.gov.au (preferred) or dpc@mep.com or
perkovic@cleese.apana.org.au
NeuDL is available from the anonymous ftp site at The University of Alabama: cs.ua.edu (130.160.44.1) in the file /pub/neudl/NeuDLver021.tar. The tarred file contains the interpreter source code (in C++), a user manual, a paper about NeuDL, and about 25 sample NeuDL programs. A document demonstrating NeuDL's capabilities is also available from the ftp site: /pub/neudl/demo.doc. For more information contact the author: Joey Rogers (jrogers@buster.eng.ua.edu).
DemoGNG can be accessed most easily at http://www.neuroinformatik.ruhr-uni-bochum.de/ in the file /ini/VDM/research/gsn/DemoGNG/GNG.html, where it is embedded as a Java applet in a Web page and is downloaded for immediate execution when you visit the page. An accompanying paper entitled "Some competitive learning methods" describes the implemented models in detail and is available in HTML on the same server in the directory ini/VDM/research/gsn/JavaPaper/.
It is also possible to download the complete source code and a PostScript version of the paper via anonymous ftp from ftp.neuroinformatik.ruhr-uni-bochum.de [134.147.176.16] in directory /pub/software/NN/DemoGNG/. The software is in the file DemoGNG-1.00.tar.gz (193 KB) and the paper in the file sclm.ps.gz (89 KB). There is also a README file (9 KB). Please send any comments and questions to demogng@neuroinformatik.ruhr-uni-bochum.de, which will reach Hartmut Loos, who wrote DemoGNG, as well as Bernd Fritzke, the author of the accompanying paper.
PMNEURO 1.0a is available at:
ftp://ftp.uni-stuttgart.de/pub/systems/os2/misc/pmneuro.zip
PMNEURO 1.0a creates neural networks (backpropagation); propagation results can be used as new training input for creating new networks and for subsequent propagation trials.
Name: nn/xnn
Company: Neureka ANS
Address: Klaus Hansens vei 31B, 5037 Solheimsviken, NORWAY
Phone: +47 55 20 15 48
Email: neureka@bgif.no
URL: http://www.bgif.no/neureka/
Operating systems: nn: UNIX or MS-DOS; xnn: UNIX/X-windows; UNIX flavours: OSF1, Solaris, AIX, IRIX, Linux (1.2.13)
System requirements: Min. 20 Mb HD + 4 Mb RAM available. If only the nn/netpack part is used (i.e. not the GUI), much less is needed.
Approx. price: Free for 30 days after installation, fully functional. After 30 days: USD 250; 35% educational discount.

A comprehensive shareware system for developing and simulating artificial neural networks. You can download the software from the URL given above.
nn is a high-level neural network specification language. The current version is best suited for feed-forward nets, but recurrent models can be and have been implemented as well. The nn compiler can generate C code or executable programs with a powerful command line interface, but everything may also be controlled via the graphical interface (xnn). It is possible for the user to write C routines that can be called from inside the nn specification, and to use the nn specification as a function called from a C program. These features make nn well suited for application development. Please note that no programming is necessary in order to use the network models that come with the system (netpack).
xnn is a graphical front end to networks generated by the nn compiler, and to the compiler itself. The xnn graphical interface is intuitive and easy to use for beginners, yet powerful, with many possibilities for visualizing network data. Data may be visualized during training, testing or 'off-line'.
netpack: A number of networks have already been implemented in nn and can be used directly: MAdaline, ART1, Backpropagation, Counterpropagation, Elman, GRNN, Hopfield, Jordan, LVQ, Perceptron, RBFNN, SOFM (Kohonen). Several others are currently being developed.
The pattern files used by the networks have a simple and flexible format, and can easily be generated from other kinds of data. The data files generated by the network can be saved in ASCII or binary format. Functions for converting and pre-processing data are available.
NNDT - Neural Network Development Tool
Evaluation version 1.4
Björn Saxen, 1995
http://www.abo.fi/~bjsaxen/nndt.html
ftp://ftp.abo.fi/pub/vt/bjs/
The NNDT software is a tool for neural network training. The user interface is developed with MS Visual Basic 3.0 professional edition. DLL routines (written in C) are used for most of the mathematics. The program can be run on a personal computer with MS Windows, version 3.1.
Contact: bjorn.saxen@abo.fi or
Bjorn Saxen
Heat Engineering Laboratory
Abo Akademi University
Biskopsgatan 8
SF-20500 Abo
Finland

Remember, this program comes free but with no guarantee!
A user's guide for NNDT is delivered in PostScript format. The document is split into three parts and compressed into a file called MANUAL.ZIP. Because many bitmap figures are included, the total size of the uncompressed files is quite large, approximately 1.5 MB.
The training requires a set of input signals and corresponding output signals, stored in a file referred to as the pattern file. This is the only file the user must provide. Optionally, parameters defining the pattern file columns, network size, and network configuration may be stored in a file referred to as the setup file.
NNDT includes a routine for graphical presentation of output signals, node activations, residuals, and weights during a run. The interface also provides facilities for examining node activations and weights as well as modifying weights.
A Windows help file is included; help is available at any time during NNDT execution by pressing F1.
Trajan 2.1 Shareware concentrates on ease-of-use and feedback. It includes Graphs, Bar Charts and Data Sheets presenting a range of Statistical feedback in a simple, intuitive form. It also features extensive on-line Help.
The Registered version of the package can support very large networks (up to 128 layers with up to 8,192 units each, subject to memory limitations in the machine), and allows simple Cut and Paste transfer of data to/from other Windows packages, such as spreadsheet programs. The Unregistered version features limited network size and no Clipboard Cut-and-Paste.
There is also a Professional version of Trajan 2.1, which supports a wider range of network models, training algorithms and other features.
See Trajan Software's Home Page at http://www.trajan-software.demon.co.uk for further details, and a free copy of the Shareware version.
Alternatively, email andrew@trajan-software.demon.co.uk for more details.
The major motivation for Nenet was to create a user-friendly SOM algorithm tool with good visualization capabilities and a GUI allowing efficient control of the SOM parameters. The use scenarios stem from the user's point of view, and considerable work has gone into ease of use and versatile visualization methods.
With Nenet, all the basic steps in map control can be performed. In addition, Nenet includes some more exotic and involved features, especially in the area of visualization.
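For background, the computation such a tool controls is small at its core. Here is a generic C sketch of one SOM training step, i.e. best-matching-unit search followed by a Gaussian neighbourhood update. It is illustrative only, not Nenet code; the map size and parameters are made up, and in real training alpha and radius would shrink over time:

  #include <stdio.h>
  #include <stdlib.h>
  #include <math.h>

  #define W 8   /* map width  (made up) */
  #define H 6   /* map height (made up) */
  #define D 3   /* input dimension      */

  static double weight[H][W][D];

  /* One SOM step: find the best-matching unit (BMU) for input x, then
     pull the BMU and its grid neighbours toward x. */
  static void som_step(const double x[D], double alpha, double radius)
  {
      int bi = 0, bj = 0, i, j, k;
      double best = 1e30;

      for (i = 0; i < H; i++)                  /* BMU search */
          for (j = 0; j < W; j++) {
              double d = 0, t;
              for (k = 0; k < D; k++) {
                  t = x[k] - weight[i][j][k];
                  d += t * t;
              }
              if (d < best) { best = d; bi = i; bj = j; }
          }

      for (i = 0; i < H; i++)                  /* neighbourhood update */
          for (j = 0; j < W; j++) {
              double g = ((i-bi)*(i-bi) + (j-bj)*(j-bj))
                         / (2.0 * radius * radius);
              double f = alpha * exp(-g);
              for (k = 0; k < D; k++)
                  weight[i][j][k] += f * (x[k] - weight[i][j][k]);
          }
  }

  int main(void)
  {
      double x[D] = { 0.8, 0.1, 0.3 };         /* one example input */
      int i, j, k;
      srand(1);
      for (i = 0; i < H; i++)                  /* random initial map */
          for (j = 0; j < W; j++)
              for (k = 0; k < D; k++)
                  weight[i][j][k] = (double)rand() / RAND_MAX;
      for (i = 0; i < 100; i++)
          som_step(x, 0.1, 2.0);
      printf("unit (0,0): %g %g %g\n",
             weight[0][0][0], weight[0][0][1], weight[0][0][2]);
      return 0;
  }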
Features in Nenet version 1.0:
The Nenet web site is at: http://www.hut.fi/~jpronkko/nenet.html
The web site contains further information on Nenet as well as the downloadable Nenet files (3 disks totalling about 3 MB).
If you have any questions whatsoever, please contact: Nenet-Team@hut.fi or phassine@cc.hut.fi
Name: NICO Artificial Neural Network Toolkit
Author: Nikko Strom
Address: Speech, Music and Hearing, KTH, S-100 44, Stockholm, Sweden
Email: nikko@speech.kth.se
URL: http://www.speech.kth.se/NICO/index.html
Platforms: UNIX, ANSI C; Source code tested on: HPUX, SUN Solaris, Linux
Price: Free

The NICO Toolkit is an artificial neural network toolkit designed and optimized for automatic speech recognition applications. Networks with both recurrent connections and time-delay windows are easily constructed. The network topology is very flexible -- any number of layers is allowed and layers can be arbitrarily connected. Sparse connectivity between layers can be specified. Tools for extracting input features from the speech signal are included, as well as tools for computing target values from several standard phonetic label-file formats.
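The time-delay window idea is simple to state in code: at time t the network sees not just the current feature frame but a fixed window of past and future frames concatenated into one input vector. A minimal C sketch (illustrative only, not the NICO API; all sizes are made up) follows:

  #include <stdio.h>

  #define NFEAT 3   /* features per frame (made up) */
  #define LEFT  2   /* frames of left context       */
  #define RIGHT 2   /* frames of right context      */

  /* Copy frames t-LEFT .. t+RIGHT into one input vector, duplicating
     the first/last frame at the sequence edges. */
  static void window_input(const double frames[][NFEAT], int nframes,
                           int t, double out[(LEFT+1+RIGHT)*NFEAT])
  {
      int d, i, k = 0;
      for (d = -LEFT; d <= RIGHT; d++) {
          int s = t + d;
          if (s < 0) s = 0;                    /* clamp at the edges */
          if (s >= nframes) s = nframes - 1;
          for (i = 0; i < NFEAT; i++)
              out[k++] = frames[s][i];
      }
  }

  int main(void)
  {
      double frames[5][NFEAT] = {
          {0,0,0}, {1,1,1}, {2,2,2}, {3,3,3}, {4,4,4}
      };
      double in[(LEFT+1+RIGHT)*NFEAT];
      int i;
      window_input(frames, 5, 2, in);          /* window centred on t=2 */
      for (i = 0; i < (LEFT+1+RIGHT)*NFEAT; i++)
          printf("%g ", in[i]);
      printf("\n");
      return 0;
  }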
Algorithms:
------------------------------------------------------------------------
For some of these simulators there are user mailing lists. Get the packages and look into their documentation for further info.
If you are using a small computer (PC, Mac, etc.) you may want to have a look at the Central Neural System Electronic Bulletin Board (see the question "Other sources of information"). Modem: 409-737-5222; Sysop: Wesley R. Elsberry; 4160 Pirates' Beach, Galveston, TX, USA; welsberr@orca.tamu.edu. There are lots of small simulator packages in the CNS ANNSIM file set. There is an ftp mirror site for the CNS ANNSIM file set at me.uta.edu [129.107.2.20] in the /pub/neural directory. Most ANN offerings are in /pub/neural/annsim.
------------------------------------------------------------------------
Next part is part 6 (of 7). Previous part is part 4.