(Cross entropy loss) accuracy on siamese net always seems 50%

I have a siamese network that I am training on positive and negative samples, minimizing the cross entropy loss between the outputs (pos/neg) and their corresponding targets. However, the test accuracy is ALWAYS 50% (i.e. 50% of the test set is classified as correct, presumably the positive samples), and I am not sure how to troubleshoot this issue.
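Roughly, my setup looks like the sketch below (heavily simplified; the layer sizes, the embedding branch, and the pair-scoring head are placeholders rather than my actual model):

```python
import torch
import torch.nn as nn

class SiameseNet(nn.Module):
    def __init__(self, in_dim=32, emb_dim=16):
        super().__init__()
        # Shared embedding branch applied to both inputs (placeholder sizes)
        self.embed = nn.Sequential(nn.Linear(in_dim, emb_dim), nn.ReLU())
        # Scores the absolute difference of the two embeddings
        self.head = nn.Linear(emb_dim, 1)

    def forward(self, x1, x2):
        e1, e2 = self.embed(x1), self.embed(x2)
        return self.head(torch.abs(e1 - e2)).squeeze(1)  # raw logits

net = SiameseNet()
criterion = nn.BCEWithLogitsLoss()  # cross entropy over pos/neg targets
optimizer = torch.optim.Adam(net.parameters(), lr=1e-3)

# One toy training step on random positive/negative pairs
x1, x2 = torch.randn(8, 32), torch.randn(8, 32)
targets = torch.randint(0, 2, (8,)).float()  # 1 = positive pair, 0 = negative
logits = net(x1, x2)
loss = criterion(logits, targets)
optimizer.zero_grad()
loss.backward()
optimizer.step()

# Test accuracy: threshold the logits at 0 (i.e. probability 0.5)
preds = (logits > 0).float()
accuracy = (preds == targets).float().mean()
```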

I was wondering if the PyTorch community had any tips for troubleshooting this, as I seem lost!

Could you please post your code here? Then we can help more.
