
USENET News comp.ai.neural-nets

Thread: Inanna ANN library available at SourceForge

Newsgroups: comp.ai.neural-nets
Subject: Inanna ANN library available at SourceForge
From: magi AT iki DOT fi (Marko Grönroos)
Date: 11 Nov 2000 06:00:39 +0200

Inanna is an ANN library I've been developing for a few years for
research purposes. It has a HIGHLY object-oriented design and is
written in C++ (g++) on Solaris and Linux. It has just gone through
heavy changes, and some restructuring is still to be done, so it
should be considered a VERY ALPHA VERSION!!!

            http://sourceforge.net/projects/inanna/

The OO design is quite open and generic; that openness has been my
main design goal. Some of the main features:

         * Object-oriented network structure (network, neurons, connections)
         * Pattern set objects
         * Pattern set file format objects
         * Equalization and unequalization (!) objects
         * Training algorithm objects
         * Early stopping objects
         * Network visualization objects (partially implemented)
         * Some other stuff
         * Some example projects

The architecture is such that user-defined components should be *very*
easy to implement.
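
To give a feel for what this kind of object structure looks like in
use, here is a small self-contained C++ sketch with a pattern set
object, a network object and a swappable training algorithm object.
Every class name and interface below (Pattern, PatternSet, Network,
Trainer, BackpropTrainer) is invented just for this example; it is
NOT Inanna's actual API, so check the headers and the documentation
for the real names.

    // Compilable sketch of the kind of object structure described above:
    // pattern sets, a network, and training algorithm objects that can be
    // swapped or subclassed.  All names are invented for this example.

    #include <cmath>
    #include <cstddef>
    #include <cstdio>
    #include <vector>

    // One training example: an input vector and a target vector.
    struct Pattern { std::vector<double> input, target; };

    // A pattern set; a file-format object would normally fill one of these.
    typedef std::vector<Pattern> PatternSet;

    // A minimal "network": one layer of sigmoid neurons with a bias weight,
    // standing in for the full network/neuron/connection structure.
    class Network {
    public:
        Network(size_t inputs, size_t outputs)
            : weights(outputs, std::vector<double>(inputs + 1, 0.1)) {}

        std::vector<double> propagate(const std::vector<double>& in) const {
            std::vector<double> out(weights.size());
            for (size_t o = 0; o < weights.size(); ++o) {
                double sum = weights[o].back();                // bias weight
                for (size_t i = 0; i < in.size(); ++i)
                    sum += weights[o][i] * in[i];
                out[o] = 1.0 / (1.0 + std::exp(-sum));         // sigmoid
            }
            return out;
        }

        std::vector< std::vector<double> > weights;   // [neuron][inputs..bias]
    };

    // Abstract training algorithm object; a user-defined algorithm would
    // derive from this and override trainEpoch().
    class Trainer {
    public:
        virtual ~Trainer() {}
        virtual double trainEpoch(Network& net, const PatternSet& set) = 0;
    };

    // Online backprop (delta rule) for the single-layer network above.
    class BackpropTrainer : public Trainer {
    public:
        explicit BackpropTrainer(double rate) : rate(rate) {}

        virtual double trainEpoch(Network& net, const PatternSet& set) {
            double sse = 0.0;
            for (size_t p = 0; p < set.size(); ++p) {
                std::vector<double> out = net.propagate(set[p].input);
                for (size_t o = 0; o < out.size(); ++o) {
                    double err   = set[p].target[o] - out[o];
                    double delta = err * out[o] * (1.0 - out[o]);  // sigmoid derivative
                    sse += err * err;
                    for (size_t i = 0; i < set[p].input.size(); ++i)
                        net.weights[o][i] += rate * delta * set[p].input[i];
                    net.weights[o].back() += rate * delta;         // bias update
                }
            }
            return sse;
        }
    private:
        double rate;
    };

    int main() {
        // Teach a single neuron the two-input OR function.
        const double data[4][3] = {{0,0,0},{0,1,1},{1,0,1},{1,1,1}};
        PatternSet set(4);
        for (int i = 0; i < 4; ++i) {
            set[i].input.assign(data[i], data[i] + 2);
            set[i].target.assign(data[i] + 2, data[i] + 3);
        }

        Network net(2, 1);
        BackpropTrainer trainer(0.5);
        double sse = 1.0;
        for (int epoch = 0; epoch < 10000 && sse > 0.05; ++epoch)
            sse = trainer.trainEpoch(net, set);   // crude stop on training error

        for (int i = 0; i < 4; ++i)
            std::printf("%g OR %g -> %.3f\n", data[i][0], data[i][1],
                        net.propagate(set[i].input)[0]);
        return 0;
    }

The point is only the shape of the design: because the training
algorithm is an abstract base class, adding a user-defined algorithm
means deriving one new class instead of touching the network code.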

Users and especially developers are welcome. The library is LGPL
licensed, and CVS accounts are available at SourceForge.net.

It uses the GNU "./configure ; make ; make install" build system,
although it probably won't go that easily for you... If you try it,
PLEASE tell me whether you got it to compile or even to work, and
what problems you had (I KNOW you'll have many). I really don't know
how easy it is to use...

It has some nice documentation written with LyX, but the
documentation isn't up at SourceForge quite yet.

History: I developed the previous version for a study involving
evolutionary design of neural networks. That part of the project is
currently broken, though, because of various library
incompatibilities. But... maybe soon... Previously the library used
SNNS for training, but now it implements two training algorithms of
its own: Backprop and Resilient Backprop (Rprop). They're not quite
as fast as SNNS, but almost (I'm working on that).
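
For the curious, the heart of Resilient Backprop is a sign-based
update rule: every weight has its own step size, which grows while
the error gradient keeps its sign from epoch to epoch and shrinks
when the sign flips. Below is a generic C++ sketch of that rule (the
plain variant without weight backtracking, with the usual constants
from Riedmiller & Braun); it illustrates the algorithm itself, not
Inanna's internal code.

    #include <algorithm>
    #include <cstddef>
    #include <cstdio>
    #include <vector>

    // Per-weight Rprop state: an individual step size and the previous gradient.
    struct RpropState {
        std::vector<double> step;       // step size for each weight
        std::vector<double> prevGrad;   // gradient from the previous epoch
        RpropState(size_t n) : step(n, 0.1), prevGrad(n, 0.0) {}
    };

    // One Rprop update for a flat weight vector, given the batch gradient dE/dw.
    void rpropUpdate(std::vector<double>& weights,
                     const std::vector<double>& grad,
                     RpropState& state)
    {
        const double etaPlus = 1.2, etaMinus = 0.5;   // standard growth/shrink factors
        const double stepMax = 50.0, stepMin = 1e-6;

        for (size_t i = 0; i < weights.size(); ++i) {
            double sign = state.prevGrad[i] * grad[i];
            if (sign > 0.0)        // gradient kept its sign: take bigger steps
                state.step[i] = std::min(state.step[i] * etaPlus, stepMax);
            else if (sign < 0.0)   // sign flipped: we overshot, take smaller steps
                state.step[i] = std::max(state.step[i] * etaMinus, stepMin);

            // Move against the gradient by the weight's own step size.
            if (grad[i] > 0.0)      weights[i] -= state.step[i];
            else if (grad[i] < 0.0) weights[i] += state.step[i];

            state.prevGrad[i] = grad[i];   // remember for the next sign comparison
        }
    }

    int main() {
        // Tiny demonstration: minimize f(w) = (w - 3)^2 with Rprop.
        std::vector<double> w(1, 0.0);
        RpropState state(1);
        for (int epoch = 0; epoch < 50; ++epoch) {
            std::vector<double> grad(1, 2.0 * (w[0] - 3.0));   // dE/dw
            rpropUpdate(w, grad, state);
        }
        std::printf("w after 50 Rprop steps: %f (minimum is at 3.0)\n", w[0]);
        return 0;
    }

Because the updates depend only on the sign of the gradient, Rprop is
insensitive to the scale of the error surface, which is a large part
of why it usually trains faster than plain backprop.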

--
-- Marko Grönroos, magi AT iki DOT fi (http://www.iki.fi/magi/)
-- Paradoxes are the source of truth and the end of wisdom
