In the TensorFlow API docs there is a keyword argument called logits. What is it? A lot of methods are written like:

tf.nn.softmax(logits, name=None)

If logits is just a generic Tensor input, why is it named logits?
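For context, this is how I am calling it at the moment. Here logits is just a plain tensor of scores with made-up values, written against the TF 1.x session API:

import tensorflow as tf

# A batch of raw, unnormalized scores (one row per example); values are made up.
logits = tf.constant([[2.0, 1.0, 0.1],
                      [0.5, 2.5, 1.0]])

probs = tf.nn.softmax(logits)  # each row of the output sums to 1.0

with tf.Session() as sess:
    print(sess.run(probs))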
Secondly, what is the difference between the following two methods?
tf.nn.softmax(logits, name=None)
tf.nn.softmax_cross_entropy_with_logits(logits, labels, name=None)
I know what tf.nn.softmax does, but not the other. An example would be really helpful.
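To show where I'm at, this is roughly what I understand tf.nn.softmax to compute, sketched in plain numpy with made-up numbers:

import numpy as np

# One row of logits (made-up values).
logits = np.array([2.0, 1.0, 0.1])

# Exponentiate and normalize so the outputs sum to 1 (a probability distribution).
# Subtracting the max first is only for numerical stability.
exp = np.exp(logits - np.max(logits))
probs = exp / exp.sum()

print(probs)        # roughly [0.659 0.242 0.099]
print(probs.sum())  # 1.0

What I don't see is what the _with_logits variant adds on top of this, and what the labels argument is expected to look like.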