
©Marko Grönroos, 1998

USENET News comp.ai.genetic

Thread: NN evolving techniques

Previous thread: Mutation of probabilities?
Next thread: GA and Neural Networks
[Other threads] [Other newsgroups]
From: magi AT iki PISTE fi (Marko Grönroos)
Newsgroups: comp.ai.genetic,comp.ai.neural-nets
Subject: Re: NN evolving techniques
Date: 15 Apr 1999 20:37:17 +0200

"Gen" <Julien PISTE Calas AT insa-lyon.fr> writes:
> I was reading a few papers on GA evolving NN, and all the methods I read
> (direct encoding, Kitano) only tell how the connexions evolve - not the
> connexions weights

For direct encoding, the solution is trivial: just add a real-valued
gene for each connection that describes its weight. You can encode the
weights as a real-valued vector, or as binary-encoded real values, if
you prefer a binary genetic representation.
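As a minimal sketch of the direct encoding described above (network size, weight range, and mutation parameters are my own illustrative assumptions, not from the post):

```python
import random

N_NEURONS = 4  # fully connected net => N*N possible connections


def random_genome(n=N_NEURONS):
    # one real-valued gene per connection, stored in row-major order
    return [random.uniform(-1.0, 1.0) for _ in range(n * n)]


def decode(genome, n=N_NEURONS):
    # interpret the flat genome as an n x n connection-weight matrix
    return [genome[i * n:(i + 1) * n] for i in range(n)]


def mutate(genome, rate=0.1, sigma=0.2):
    # Gaussian perturbation of each weight gene with probability `rate`
    return [g + random.gauss(0.0, sigma) if random.random() < rate else g
            for g in genome]


genome = random_genome()
weights = decode(genome)  # feed this matrix to your network evaluator
```

The fitness function would then build a network from `weights`, evaluate it on the task, and hand the score back to the EA.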

For Kitano's Graph Generation Grammar, you could use a variation of
the "preterminal" productions where the left-hand side is {a...p} and
the right-hand side is a 2x2 matrix of REAL values (not binary, as in
the original model). I think you could generate those matrices purely
randomly, or you might even encode them in the genome and then evolve
them along with the discrete productions...
      I don't have a clear idea about how you should rewrite the terminal
real values. Maybe just copy them. That may produce large areas of the
same weights, but I'm not sure if that's bad.
      Then interpret the final connection matrix as connection weights,
not the existence of the connections. If you want to encode both
weights and topology, you'll have to encode them both in the
genome. (I haven't tried this, nor do I know if anyone has, but it's a
rather obvious extension to the method.)
      This is just an extension, and definitely not a very elegant way
to do it. I have thought of better generative encodings, but... I'll
try them out some day soon and publish them if they work nicely.
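The grammar variation sketched above could look something like this. The rule layout, symbol names, and single rewriting step are my own illustrative assumptions; Kitano's actual model rewrites through several matrix generations:

```python
import random

NONTERMINALS = "abcdefghijklmnop"  # the {a...p} symbol set


def random_rules():
    # S expands into a 2x2 matrix of nonterminals; each nonterminal
    # expands into a 2x2 matrix of REAL values (the "preterminal"
    # productions with real-valued right-hand sides)
    rules = {"S": [[random.choice(NONTERMINALS) for _ in range(2)]
                   for _ in range(2)]}
    for nt in NONTERMINALS:
        rules[nt] = [[random.uniform(-1.0, 1.0) for _ in range(2)]
                     for _ in range(2)]
    return rules


def expand(rules):
    # one rewriting step: S -> 2x2 of nonterminals -> 4x4 of reals,
    # read directly as a connection-weight matrix
    top = rules["S"]
    weights = [[0.0] * 4 for _ in range(4)]
    for i in range(2):
        for j in range(2):
            block = rules[top[i][j]]  # 2x2 block of real weights
            for bi in range(2):
                for bj in range(2):
                    weights[2 * i + bi][2 * j + bj] = block[bi][bj]
    return weights


W = expand(random_rules())  # 4x4 real-valued weight matrix
```

Evolving the `rules` dictionary itself (both the discrete S-production and the real-valued blocks) would correspond to encoding the productions in the genome as suggested above.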

I also think that Kitano's original GGG is a bit too complex an
encoding (why have so many different production rule sets? Just use
one set and a special interpretation of the final matrix).

Kitano also has a newer encoding method, from 1994. I remember that it
includes some sort of weight encoding.

> ; that is, they only code the position of the neurons
> which are linked to others. Could anyone tell me a way to encode the
> connexion weights in the chromosomes, since the error of the network is not
> available in the problem I'm trying to solve - and consequently I can only
> change weights through evolution techniques ?

I don't really understand this. How can you expect to use EA if you
don't have an error value that you can use as a fitness value?

My own very modest evolutionary ANN research is available at:
      http://www.iki.fi/magi/opinnot/gradu/

--
-- Marko Grönroos, magi AT iki PISTE fi (http://www.iki.fi/magi/)
-- Paradoxes are the source of truth and the end of wisdom
