Training loss jumps up suddenly from nearly 1 to 39?

I’m trying to train a self-supervised model to detect human interactions. After creating the positive and negative pairs, when I run my training module I get a strange training loss.

For the first half of the batches in each epoch I get a loss between 0.5 and 1; however, in the middle of the epoch it suddenly jumps to values like 39.8, which is very strange. I am using a contrastive loss function. Is there any reason why this could be happening?

If you are not shuffling the dataset, you might encounter a “difficult” example at the same iteration every epoch, which would increase the loss at that point. Could you shuffle the dataset and check whether the behavior changes?
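As a minimal sketch of what shuffling looks like, assuming the training loop uses a PyTorch `DataLoader` (the dataset here is a placeholder, not the poster's actual data):

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

# Placeholder dataset: 8 samples of 4-dim features with binary pair labels.
data = TensorDataset(torch.randn(8, 4), torch.randint(0, 2, (8,)))

# shuffle=True reorders the samples every epoch, so "difficult" pairs are
# not always grouped into the same batch at the same iteration.
loader = DataLoader(data, batch_size=4, shuffle=True)

for batch_x, batch_y in loader:
    # training step would go here
    print(batch_x.shape, batch_y.shape)
```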

Hi, thank you for this! In fact, the reason my loss was so high all of a sudden was that I had set the default margin of my contrastive loss to 10 (far too high for my use case!). But now I have another issue: my loss does not converge. I’ll try reshuffling the dataset and test out some other things.
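To see why a large margin can produce spikes like the one described, here is a sketch of a standard contrastive loss (the exact formulation the poster used is not shown, so this is an assumption based on the common pairwise definition):

```python
import torch
import torch.nn.functional as F

def contrastive_loss(x1, x2, label, margin=1.0):
    """Pairwise contrastive loss.

    label = 1 for positive (similar) pairs, 0 for negative pairs.
    Positive pairs are pulled together; negative pairs are pushed
    apart until their distance exceeds `margin`.
    """
    dist = F.pairwise_distance(x1, x2)
    pos = label * dist.pow(2)
    neg = (1 - label) * torch.clamp(margin - dist, min=0).pow(2)
    return (pos + neg).mean()

# With margin=10, a single negative pair at distance ~3.7 contributes
# (10 - 3.7)^2 ≈ 39.7 -- the same order as the spike described above,
# while with margin=1 the same pair would contribute nothing at all.
```

With a margin that large, almost every negative pair sits inside the margin, so any batch dominated by negatives produces a huge loss value.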