in AI and Deep Learning by (50.2k points)

I'm looking for a neural network model with specific characteristics. This model may not exist...

I need a network that doesn't use "layers" as traditional artificial neural networks do. Instead, I want [what I believe to be] a more biological model.

This model will house a large cluster of interconnected neurons, like the image below. A few neurons (at the bottom of the diagram) will receive input signals, and a cascade effect will cause successive, connected neurons to fire or not, depending on signal strength and connection weight. This is nothing new, but there are no explicit layers... just more and more distant, indirect connections.

As you can see, I also have the network divided into sections (circles). Each circle represents a semantic domain (a concept from linguistics), which is the core information surrounding a concept; essentially, a semantic domain is a concept.

Connections between nodes within a section have higher weights than connections between nodes of different sections. So the nodes for "car" are more strongly connected to one another than to the nodes connecting "English" to "car". Thus, when a neuron in a section fires (is activated), it is likely that the entire section (or most of it) will also be activated.

All in all, I need output patterns to be used as input for further output, and so on. A cascade effect is what I am after.
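To make the cascade concrete, here is a rough Python sketch of what I have in mind; the node names, section groupings, weights, and threshold below are invented purely for illustration:

import itertools

# Nodes grouped into "semantic domain" sections (invented example data).
sections = {
    "car": ["wheel", "engine", "drive"],
    "english": ["word", "grammar", "verb"],
}

# Weighted connections: strong inside a section, weak across sections.
weights = {}
for nodes in sections.values():
    for a, b in itertools.permutations(nodes, 2):
        weights[(a, b)] = 0.8              # strong intra-section link
weights[("drive", "verb")] = 0.2           # weak cross-section link
weights[("verb", "drive")] = 0.2

THRESHOLD = 0.5

def cascade(inputs, steps=5):
    """Spread activation through the graph: a node fires once the weighted
    input it receives from already-firing nodes exceeds THRESHOLD."""
    activation = {n: 0.0 for ns in sections.values() for n in ns}
    activation.update(inputs)
    for _ in range(steps):
        incoming = {n: 0.0 for n in activation}
        for (src, dst), w in weights.items():
            if activation[src] >= THRESHOLD:
                incoming[dst] += w * activation[src]
        for node, signal in incoming.items():
            if signal >= THRESHOLD:
                activation[node] = max(activation[node], min(signal, 1.0))
    return sorted(n for n, a in activation.items() if a >= THRESHOLD)

# Stimulating one "car" node activates (most of) the whole "car" section,
# while the weak cross-section link keeps the "english" nodes below threshold.
print(cascade({"wheel": 1.0}))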

I hope this makes sense. Please ask for clarification where needed.

Are there any existing models that already do what I've described?

[Diagram: a cluster of interconnected neurons grouped into circled sections (semantic domains), with the input neurons at the bottom]

1 Answer

by (107k points)

You can connect the nodes to each other any way you like, from one layer to another. You can leave pairs of nodes unconnected simply by using a constant zero as the weight between them, or, if object-oriented programming is used, by leaving the unwanted connections out of the connection phase. Skipping layers can be awkward with a standard NN model, but one way is to use a dummy node for each layer a connection needs to cross: copying the original output * weight value from the node to the dummy is equivalent to skipping a layer, and it also keeps the standard NN model intact.
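As a minimal sketch of the zero-weight and dummy-node idea (NumPy; the layer sizes and weight values here are arbitrary and invented for illustration):

import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

# Standard two-layer feed-forward net: 3 inputs -> 3 hidden -> 2 outputs.
# Unwanted connections are "left out" by fixing their weights to exactly zero.
W1 = np.array([[0.5, 0.0, 0.0],    # hidden 0 only sees input 0
               [0.0, 0.7, 0.0],    # hidden 1 only sees input 1
               [0.0, 0.0, 1.0]])   # hidden 2 is a dummy: weight 1.0 from input 2
W2 = np.array([[0.2, 0.4, 0.0],
               [0.0, 0.0, 0.9]])   # output 1 effectively connects straight to input 2

def forward(x):
    h = relu(W1 @ x)
    # The dummy hidden node (index 2) just copies input 2 forward (weight 1.0,
    # and ReLU leaves a non-negative value unchanged), so the connection
    # input2 -> output1 behaves as if it skipped the hidden layer entirely,
    # while the layered structure of the model stays intact.
    return W2 @ h

print(forward(np.array([1.0, 2.0, 3.0])))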

Your network imitates this: http://nn.cs.utexas.edu/?fullmer:evolving

However, that approach does not have the network learn; instead, the network is replaced, which is described in the following link:

http://www.alanturing.net/turing_archive/pages/reference articles/connectionism/Turing's neural networks.html

If you wish to know more about neural networks, visit this Neural Network Tutorial.
