
I'm creating a neural network using the backpropagation technique for learning.

I understand we need to find the derivative of the activation function used. I'm using the standard sigmoid function

f(x) = 1 / (1 + e^(-x))

and I've seen that its derivative is

dy/dx = f'(x) = f(x) * (1 - f(x))

This may be a daft question, but does this mean we have to pass x through the sigmoid function twice when evaluating the derivative, so that it expands to

dy/dx = f'(x) = (1 / (1 + e^(-x))) * (1 - 1 / (1 + e^(-x)))

or is it simply a matter of taking the already calculated output of f(x), which is the output of the neuron, and substituting that value for f(x)?

1 Answer


I would suggest trying to take the derivative yourself. With a bit of algebra you can derive exactly f(x) * (1 - f(x)), and then you'll see exactly what is going on.
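For reference, here is one way the algebra works out, using the chain rule on 1 / (1 + e^(-x)):

f(x) = 1 / (1 + e^(-x))
f'(x) = e^(-x) / (1 + e^(-x))^2
      = (1 / (1 + e^(-x))) * (e^(-x) / (1 + e^(-x)))
      = f(x) * ((1 + e^(-x) - 1) / (1 + e^(-x)))
      = f(x) * (1 - f(x))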

The two ways of doing it are equivalent (since mathematical functions don't have side effects and always return the same output for a given input), so you might as well do it the faster second way: reuse the neuron's output that you already computed in the forward pass.
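As a quick sanity check, here is a minimal Python/NumPy sketch (function names are just illustrative) showing that both ways produce the same values:

import numpy as np

def sigmoid(x):
    # Standard logistic sigmoid: 1 / (1 + e^(-x))
    return 1.0 / (1.0 + np.exp(-x))

def derivative_from_input(x):
    # Option 1: pass x through the sigmoid again, then apply f(x) * (1 - f(x))
    fx = sigmoid(x)
    return fx * (1.0 - fx)

def derivative_from_output(fx):
    # Option 2: reuse the already calculated neuron output fx = f(x)
    return fx * (1.0 - fx)

x = np.linspace(-5.0, 5.0, 11)
out = sigmoid(x)  # forward-pass output of the neuron
print(np.allclose(derivative_from_input(x), derivative_from_output(out)))  # True

In a backprop implementation you would normally cache the forward-pass output and use option 2, which avoids recomputing the exponential.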

You can refer to the following link:

https://towardsdatascience.com/derivative-of-the-sigmoid-function-536880cf918e

