+1 vote
in AI and Deep Learning by (3.4k points)
I understand that we have to feed the activations of the artificial neurons in the last layer to the TensorFlow function given below, but why do we call them logits? Isn't logit a mathematical function?

loss_function = tf.nn.softmax_cross_entropy_with_logits(
    logits=last_layer,
    labels=target_output
)

Can anyone answer this?


2 Answers

+1 vote
by (46k points)

Best answer

Logits are the values that are used as input to softmax. To understand this better, see the official TensorFlow documentation.

Hence, the logit function maps probabilities ([0, 1]) to the real numbers ((-inf, inf)):

L = ln(p / (1 - p))      p = 1 / (1 + e^(-L))

Taking limits, a probability of 0.5 corresponds to a logit of 0. Therefore, positive logits correspond to probabilities greater than 0.5, and negative logits correspond to probabilities less than 0.5.

The logit is also referred to as the inverse of the sigmoid function. It is mainly used in statistics.
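For illustration, here is a minimal sketch in plain Python/NumPy (the names logit and sigmoid are just illustrative helpers, not TensorFlow API) showing that the two formulas above are inverses of each other:

import numpy as np

def logit(p):
    # L = ln(p / (1 - p)): maps a probability in (0, 1) to (-inf, inf)
    return np.log(p / (1 - p))

def sigmoid(L):
    # p = 1 / (1 + e^(-L)): maps a real value back to a probability
    return 1.0 / (1.0 + np.exp(-L))

print(logit(0.5))           # 0.0: a probability of 0.5 gives a logit of 0
print(logit(0.9))           # positive logit: probability > 0.5
print(logit(0.1))           # negative logit: probability < 0.5
print(sigmoid(logit(0.3)))  # 0.3: sigmoid undoes logit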


+4 votes
by (10.9k points)

Logit is a simple term that can mean different things. In mathematics, the logit is a function that maps probabilities ([0, 1]) to the real numbers ((-inf, inf)).

Ex:

N = ln(M) - ln(1 - M), where M = 1 / (1 + e^(-N))

Ex:

A logit of 0 corresponds to a probability of 0.5.

A negative logit corresponds to a probability of less than 0.5.

A positive logit corresponds to a probability of greater than 0.5.

In ML, logits are the vector of raw (non-normalized) predictions that a classification model generates, which is then passed to a normalization function. If the model is solving a multi-class classification problem, the logits serve as the input to the softmax function, which then generates a vector of probabilities with one value for each class. Logits sometimes also refer to the element-wise inverse of the sigmoid function. A minimal sketch of this is shown below.
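Here is that sketch (assuming TensorFlow 2.x; the tensor values are made up for the example), with raw logits being normalized by softmax and passed to the loss function from the question:

import tensorflow as tf

# Raw, non-normalized scores from a hypothetical last layer (3 classes)
last_layer = tf.constant([[2.0, 1.0, 0.1]])

# One-hot target: the true class is the first one
target_output = tf.constant([[1.0, 0.0, 0.0]])

# Softmax maps the logits to a probability vector that sums to 1
probs = tf.nn.softmax(last_layer)  # approx. [[0.659, 0.242, 0.099]]

# The loss takes the raw logits directly; it applies softmax
# internally, which is more numerically stable than doing it by hand
loss = tf.nn.softmax_cross_entropy_with_logits(
    labels=target_output,
    logits=last_layer
)  # approx. [0.417], i.e. -ln(0.659)

Note that the loss expects the raw scores, not probabilities, which is why the parameter is named logits.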

Hope this answer helps.
