
One thing I don't quite understand at this early stage of learning neural networks is what to initially set a "bias" to. I understand the perceptron calculates its output based on:

P * W + b > 0

and then you can update it with the learning rule b = b + (G - O), where G is the correct output and O is the actual output (1 or 0), to calculate a new bias. But what about the initial bias? I don't understand how this is calculated, or what initial value should be used besides just guessing. Is there any kind of formula for this?

Pardon me if I'm mistaken about anything; I'm still learning the whole neural network idea before I implement my own (crappy) one.

The same goes for the learning rate: most books and tutorials just kind of "pick one" for μ.


Bias is like the intercept term in a linear equation. It is an additional parameter in the neural network, used to shift the output along with the weighted sum of the inputs to the neuron. Bias is therefore a constant that helps the model fit the given data as well as possible.

The processing performed by a neuron is thus denoted as:

output  = sum (weights * inputs) + bias
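As a minimal sketch of that computation (in Python, with illustrative weight, input, and bias values that are not from the answer above):

```python
# Weighted-sum-plus-bias computation of a single neuron.
# These particular numbers are made up for illustration only.
weights = [0.5, -0.25]
inputs = [1.0, 2.0]
bias = 0.1

# output = sum(weights * inputs) + bias
output = sum(w * x for w, x in zip(weights, inputs)) + bias
print(output)  # 0.5*1.0 + (-0.25)*2.0 + 0.1 = 0.1
```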

However, you can also set the weights manually (with no training) to get some special behaviors: for example, you can use the bias to make a perceptron behave like a logic gate (assume binary inputs X1 and X2 that are either 0 or 1, and an activation function scaled to give an output of 0 or 1).

OR gate: W1=1, W2=1, Bias=0

AND gate: W1=1, W2=1, Bias=-1
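A quick sketch checking both gates with the threshold rule from the question (output 1 when the weighted sum plus bias is > 0; the `perceptron` helper is mine, not from the answer):

```python
def perceptron(inputs, weights, bias):
    """Step-activated perceptron: 1 if sum(w*x) + bias > 0, else 0."""
    total = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1 if total > 0 else 0

for x1 in (0, 1):
    for x2 in (0, 1):
        # OR gate: W1=1, W2=1, bias=0 -> fires unless both inputs are 0
        print("OR ", x1, x2, perceptron((x1, x2), (1, 1), 0))
        # AND gate: W1=1, W2=1, bias=-1 -> fires only when both inputs are 1
        print("AND", x1, x2, perceptron((x1, x2), (1, 1), -1))
```

Walking through the AND case shows why the bias matters: with inputs (1, 0) the weighted sum is 1, and the bias of -1 pulls the total down to 0, which is not > 0, so the neuron stays off.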