0 votes
2 views
in AI and Deep Learning by (50.2k points)

One of the most popular questions regarding Neural Networks seems to be:

Help!! My Neural Network is not converging!!

So, after eliminating any errors in the implementation of the network, what are the most common things one should try?

I know that the things to try vary widely depending on the network architecture. But which parameters (learning rate, momentum, initial weights, etc.) did you tweak, and what new features (windowed momentum?) did you implement to overcome similar problems while building your own neural net?

Please give answers that are language agnostic if possible. This question is intended to give some pointers to people stuck with neural nets that are not converging.

1 Answer

0 votes
by (108k points)

Despite our best efforts at designing and training a neural network, sometimes a particular network simply won't converge on a solution that is acceptable for the system requirements.

It could get close, but not meet our requirements.

How network convergence fails:

This can happen because there aren't enough nodes to transform the input data into correct outputs.

Perhaps you don't have enough training data, or the training data wasn't collected with data integrity in mind.

For a broader checklist, see the article "37 Reasons why your Neural Network is not working."

To resolve convergence problems in a backpropagation neural network, you can try the following steps:

  • Implement momentum and keep its value around 0.5 (see the training-loop sketch after this list).

  • Keep the learning rate small, for example 0.1.

  • Chart the error, weights, inputs, and outputs of each neuron; seeing the data as a graph makes it much easier to figure out what is going wrong.

  • Try out different activation functions; sticking to plain sigmoids alone may not help much.

  • Initialize all weights to random values between -1 and 1.
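To make the list above concrete, here is a minimal sketch, in Python/NumPy (an assumption, since the question is language agnostic), of a tiny from-scratch backprop network that combines those settings: sigmoid activations, weights initialized between -1 and 1, a learning rate of 0.1, momentum of 0.5, and the per-epoch error recorded so it can be charted. The single hidden layer, the layer sizes, and the XOR toy data are illustrative choices, not part of the original answer.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy data: XOR, a classic non-linearly-separable test case (illustrative only).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

n_in, n_hidden, n_out = 2, 4, 1
lr, momentum = 0.1, 0.5

# Weights initialized uniformly in [-1, 1], as suggested above.
W1 = rng.uniform(-1, 1, (n_in, n_hidden))
W2 = rng.uniform(-1, 1, (n_hidden, n_out))
vW1 = np.zeros_like(W1)   # momentum "velocity" terms
vW2 = np.zeros_like(W2)

errors = []               # chart this to see whether training converges
for epoch in range(10000):
    # Forward pass
    h = sigmoid(X @ W1)
    out = sigmoid(h @ W2)

    # Track the mean squared error for charting
    err = out - y
    errors.append(float(np.mean(err ** 2)))

    # Backward pass for the squared-error loss (sigmoid derivative is a*(1-a))
    d_out = err * out * (1 - out)
    d_hidden = (d_out @ W2.T) * h * (1 - h)

    # Classical momentum update: velocity = momentum*velocity - lr*gradient
    vW2 = momentum * vW2 - lr * (h.T @ d_out)
    vW1 = momentum * vW1 - lr * (X.T @ d_hidden)
    W2 += vW2
    W1 += vW1

print("final MSE:", errors[-1])
```

Plotting the `errors` list over epochs is usually the quickest way to see whether the network is converging, oscillating, or stuck.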

If you are implementing a neural network from scratch, a very common culprit is a bug in the weight-update function. You can find such errors through gradient checking, as sketched below.
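Gradient checking compares the analytic gradients produced by your backprop/update code against numerical gradients computed with finite differences; a large relative difference points at the buggy component. Here is a minimal sketch; the `loss(w)` callable taking a flat weight vector is a hypothetical helper, not something from the original answer.

```python
import numpy as np

def gradient_check(loss, W, analytic_grad, eps=1e-5, tol=1e-4):
    """Compare analytic_grad against central finite differences of loss at W."""
    W = np.asarray(W, dtype=float).ravel()
    analytic_grad = np.asarray(analytic_grad, dtype=float).ravel()
    num_grad = np.zeros_like(W)
    for i in range(W.size):
        w_plus, w_minus = W.copy(), W.copy()
        w_plus[i] += eps
        w_minus[i] -= eps
        num_grad[i] = (loss(w_plus) - loss(w_minus)) / (2 * eps)

    # Relative error between numerical and analytic gradients.
    denom = np.maximum(np.abs(num_grad) + np.abs(analytic_grad), 1e-12)
    rel_err = np.abs(num_grad - analytic_grad) / denom
    return bool(np.all(rel_err < tol)), float(rel_err.max())

# Usage (illustrative): a quadratic loss whose gradient we know exactly.
w0 = np.array([0.3, -0.7])
loss_fn = lambda w: 0.5 * np.sum(w ** 2)   # analytic gradient is w itself
ok, worst = gradient_check(loss_fn, w0, analytic_grad=w0.copy())
print(ok, worst)
```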
