in Machine Learning by (200 points)

In the TensorFlow API documentation, the keyword logits comes up repeatedly. What are logits? In most of the documentation I went through, it was written as:

tf.nn.softmax(logits, name=None)

If logits are just Tensors, why do they have a different name?
Lastly, I would like to know the difference between these two methods:

tf.nn.softmax(logits, name=None)
tf.nn.softmax_cross_entropy_with_logits(logits, labels, name=None)

1 Answer

by (10.9k points)

@varsha, I hope this answer helps you.

In machine learning, logits are the raw, unscaled outputs of a layer (typically the last layer before normalization); they live on a linear scale rather than a probability scale. In mathematics, the logit is the function that maps probabilities ( [0, 1] ) to the real line ( (-inf, inf) ): logit(p) = log(p / (1 - p)).
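As a quick illustration, here is a minimal NumPy sketch (not part of the TensorFlow API) of the mathematical logit and its inverse, the sigmoid:

import numpy as np

def logit(p):
    # maps a probability in (0, 1) to a real number in (-inf, inf)
    return np.log(p / (1 - p))

def sigmoid(x):
    # inverse of logit: maps a real number back into (0, 1)
    return 1 / (1 + np.exp(-x))

print(logit(0.5))           # 0.0
print(sigmoid(logit(0.9)))  # 0.9 (round trip)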

tf.nn.softmax gives only the result of applying the softmax function to an input tensor. Softmax "squashes" the inputs so that they sum to 1; it is a simple way of normalizing. The output has the same shape as the input; only the values are normalized. It is used during evaluation of the model, when you compute the probabilities that the model outputs.

import numpy as np
import tensorflow as tf

a = tf.constant(np.array([[.1, .3, .5, .9]]))

with tf.Session() as s:
    print(s.run(tf.nn.softmax(a)))

# [[ 0.16838508  0.205666    0.25120102  0.37474789]]
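You can verify the same numbers by hand with NumPy, using the usual softmax definition exp(x_i) / sum_j exp(x_j):

import numpy as np

a = np.array([[.1, .3, .5, .9]])
manual = np.exp(a) / np.sum(np.exp(a), axis=1, keepdims=True)
print(manual)               # [[0.16838508 0.205666   0.25120102 0.37474789]]
print(manual.sum(axis=1))   # [1.] -- each row sums to 1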

tf.nn.softmax_cross_entropy_with_logits computes the cross entropy of the result after the softmax function has been applied. It is mainly used during training. Conceptually, its result is similar to:

sf = tf.nn.softmax(x)
c = cross_entropy(sf, labels)  # pseudocode: cross_entropy is not a TF function

For example, if tf.nn.softmax_cross_entropy_with_logits is applied to logits of shape [2, 5], it gives an output of shape [2]: one cross-entropy value per example, with the class dimension reduced away.

If you want to minimize the cross-entropy and you are applying a softmax after your last layer, you should use tf.nn.softmax_cross_entropy_with_logits rather than composing the two steps yourself, because it covers numerically unstable corner cases.
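For instance, here is a TF 1.x-style sketch (the logits and one-hot labels are made up for illustration) comparing the fused op with the manual softmax + cross-entropy composition:

import tensorflow as tf

logits = tf.constant([[2.0, 1.0, 0.1],
                      [0.0, 5.0, 1.0]])
labels = tf.constant([[1.0, 0.0, 0.0],
                      [0.0, 1.0, 0.0]])  # one-hot targets

# fused, numerically stable op: one loss value per example
fused = tf.nn.softmax_cross_entropy_with_logits(labels=labels, logits=logits)

# manual composition: softmax followed by cross-entropy
sf = tf.nn.softmax(logits)
manual = -tf.reduce_sum(labels * tf.log(sf), axis=1)

with tf.Session() as s:
    print(s.run(fused))   # approx. [0.417  0.0247]
    print(s.run(manual))  # same values for well-behaved logits

With very large or very small logits, the manual version can overflow or end up taking log(0), while the fused op stays finite.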

To learn about the applications and types of machine learning, go through this machine learning tutorial.
