TensorFlow has the loss function tf.nn.sparse_softmax_cross_entropy_with_logits. I saw an implementation of a log-loss function built on it, like this:
import tensorflow as tf

def log_loss(logits, labels, name):
    SE_loss = tf.nn.sparse_softmax_cross_entropy_with_logits
    return tf.reduce_sum(SE_loss(logits=logits, labels=labels), [1, 2], name=name)
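If I read it correctly, the per-pixel cross entropy is summed over the two spatial axes, leaving one value per example. Here is a small usage sketch with the shapes I assume it expects (channels-last logits and integer class labels, matching my test further down):

# Assumed shapes: logits (batch, H, W, num_classes), labels (batch, H, W)
example_logits = tf.random_normal((20, 30, 30, 2))
example_labels = tf.zeros((20, 30, 30), dtype=tf.int32)
summed = log_loss(example_logits, example_labels, name="summed_loss")

with tf.Session() as sess:
    print(sess.run(summed).shape)  # (20,) -- one summed loss per example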
What is the equivalent loss in PyTorch, and how do I use it so that it gives exactly the same answer as the log_loss function above for the same logits and labels tensors?
Here is how I am testing it, but the PyTorch cross-entropy loss is not working:
import torch
import torch.nn as nn
import tensorflow as tf

loss = nn.CrossEntropyLoss()  # the PyTorch cross entropy loss I am trying to use

input = torch.randn((20, 30, 30, 2), requires_grad=True)            # channels-last logits
target = torch.empty((20, 30, 30), dtype=torch.long).random_(1)     # class indices
output = loss(input, target)

logits = input.detach().numpy()
labels = target.detach().numpy()

graph = tf.Graph()
with graph.as_default(), tf.Session(graph=graph) as sess:
    output_tf = tf.nn.sparse_softmax_cross_entropy_with_logits(
        labels=tf.convert_to_tensor(labels, dtype=tf.int32),
        logits=tf.convert_to_tensor(logits, dtype=tf.float32))
    output_tf = tf.reduce_sum(output_tf, [1, 2], name="tfloss")
    print("*" * 50)
    print("tensorflow: ", sess.run(output_tf))
    print(output)
    print("*" * 50)
I get this error
RuntimeError: Assertion `input0 == target0 && input2 == target1 && input3 == target2' failed. size mismatch (got input: 20x30x30x2, target: 20x30x30) at /pytorch/aten/src/THNN/generic/SpatialClassNLLCriterion.c:61
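If I read the assertion right, it compares the input and target sizes, which I take to mean that nn.CrossEntropyLoss wants a channels-first input of shape (N, C, H, W) together with an index target of shape (N, H, W). A tiny check of that assumption:

import torch
import torch.nn as nn

loss = nn.CrossEntropyLoss()
# Channels-first logits (N, C, H, W) with an integer target (N, H, W):
# this shape combination runs without the size-mismatch assertion.
print(loss(torch.randn(20, 2, 30, 30), torch.zeros(20, 30, 30, dtype=torch.long)))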
When I run only the TensorFlow part, I get results like the following:
tensorflow: [782.8956 814.34094 819.82416 814.04565 804.5945 817.7538 816.1295
776.78687 785.20825 801.3092 802.24805 802.3621 846.7362 792.4359
768.00555 820.16675 782.90857 846.25085 793.9464 799.1505 ]
The length of the output is 20, i.e. one summed loss per batch element.
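Based on that, my rough guess at an equivalent conversion on the PyTorch side is sketched below. This is only an assumption on my part: it uses torch.nn.functional.cross_entropy with reduction='none' (which needs a reasonably recent PyTorch version) and sums the unreduced loss over the spatial dimensions to mirror the tf.reduce_sum over axes [1, 2]:

import torch
import torch.nn.functional as F

input = torch.randn((20, 30, 30, 2), requires_grad=True)            # channels-last logits
target = torch.empty((20, 30, 30), dtype=torch.long).random_(1)     # class indices

# Move the class dimension to position 1, since cross_entropy expects (N, C, H, W),
# then sum the per-pixel loss over the two spatial dimensions -> shape (20,).
per_pixel = F.cross_entropy(input.permute(0, 3, 1, 2), target, reduction='none')
per_example = per_pixel.sum(dim=(1, 2))
print(per_example)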
I want to manipulate my input and target for the PyTorch cross entropy so that the TensorFlow code and the PyTorch code give exactly the same results; is something like the sketch above the right way to do it? Looking forward to hearing from you. Thanks in advance for putting thought into it.