
in Machine Learning by (19k points)

I want to upgrade my evolution simulator to use Hebbian learning, like this one. I basically want small creatures to be able to learn how to find food. I achieved that with a basic feedforward network, but I'm stuck on understanding how to do it with Hebbian learning. The basic principle of Hebbian learning is that if two neurons fire together, they wire together.

So, the weights are updated like this:

weight_change = learning_rate * input * output
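In NumPy that rule, applied to a whole weight matrix, is just an outer product. A minimal sketch (the shapes and variable names are illustrative, not taken from any particular simulator):

import numpy as np

learning_rate = 0.1
inputs = np.array([1.0, 0.0, 1.0])            # pre-synaptic activations
weights = np.random.rand(3, 2)                # 3 inputs feeding 2 outputs
outputs = inputs @ weights                    # post-synaptic activations

# every weight grows in proportion to its input and its output firing together
weights += learning_rate * np.outer(inputs, outputs)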

The information I've found on how this can be useful is pretty scarce, and I don't get it.

In my current version of the simulator, the weights between an action and an input (movement, eyes) are increased when a creature eats a piece of food, and I fail to see how that can translate into this new model. There simply is no room here to tell whether the creature did something right or wrong, because the only parameters are the input and the output! Basically, if one input activates movement in one direction, that weight would just keep on increasing, whether or not the creature is eating anything!

Am I applying Hebbian learning in the wrong way? Just for reference, I'm using Python.

1 Answer

by (33.1k points)

Hebb's law is a good fit for associative learning, but, as you noticed, the plain rule lets the weights grow without bound. The usual fix is to add a kind of normalization or limiting process.

The basic idea is to divide each weight by the sum of all the weights converging on the same post-synaptic neuron, so that the total weight converging on any one neuron stays fixed at 1.
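A minimal sketch of that update, assuming non-negative activations and a weight matrix W of shape (n_inputs, n_outputs) (the names here are just for illustration):

import numpy as np

def hebbian_update(W, x, y, lr=0.1):
    # Hebbian step: weights grow where pre- and post-synaptic activity coincide
    W = W + lr * np.outer(x, y)
    # normalization step: the weights converging on each post-synaptic neuron
    # (each column of W) are rescaled to sum to 1, which keeps them bounded
    return W / W.sum(axis=0, keepdims=True)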

You can implement this directly in the network you already have: keep the Hebbian update, then normalize the weights after every step.

The neurons in such a layer may all be interconnected. You start by passing in an input vector and letting the activity in the network settle; the weights are then updated on each iteration. You repeat this throughout the training phase, at the end of which associated items in the input space will tend to form grouped patches of activity in the output map.
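Roughly, a training loop along those lines could look like the sketch below; the random inputs and the crude winner-take-all "settling" step are stand-ins for whatever your simulator's sensors and network dynamics actually are:

import numpy as np

rng = np.random.default_rng(0)
n_inputs, n_outputs = 4, 3
lr = 0.1
W = rng.random((n_inputs, n_outputs))
W /= W.sum(axis=0, keepdims=True)            # weights into each output neuron start summing to 1

for step in range(1000):
    x = rng.random(n_inputs)                 # stand-in for one sensory input vector
    y = x @ W                                # post-synaptic activity
    y = np.where(y == y.max(), y, 0.0)       # let activity "settle": keep only the winning output
    W += lr * np.outer(x, y)                 # Hebbian update
    W /= W.sum(axis=0, keepdims=True)        # renormalize so no weight can grow without bound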

Hope this answer helps.
