Loss goes to Zero in first epoch!

Hi,

I am getting a weird case in which my loss just vanishes within an epoch:


Train Epoch: 1 [0/33005 (0%)] Loss: 84108.492188

Train Epoch: 1 [32000/33005 (97%)] Loss: 0.000000

Epoch: 2 LR: [0.001]

Starting Training for Epoch 2

Train Epoch: 2 [0/33005 (0%)] Loss: 0.000000

Train Epoch: 2 [32000/33005 (97%)] Loss: 0.000000

Epoch: 3 LR: [0.001]

Starting Training for Epoch 3

Train Epoch: 3 [0/33005 (0%)] Loss: 0.000000

I am at a loss as to what the issue is. I tried different random initializations as well, and the same thing happens.

Any common tips for debugging the model?

Can you print your model's output to check for anything unusual, or paste it here so we can discuss?
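A loss that hits exactly 0.0 after one epoch usually means the output already equals the target, which often points to the target leaking into the input or the loss being computed on the wrong tensors. A minimal, framework-agnostic sketch of the kind of per-batch logging meant here (the helper name `summarize` is made up; in a real PyTorch loop you would feed it something like `output.detach().cpu().tolist()`):

```python
def summarize(name, values):
    """Print and return min/max/mean of a flat list of floats for quick inspection."""
    lo, hi = min(values), max(values)
    mean = sum(values) / len(values)
    print(f"{name}: min={lo:.4g} max={hi:.4g} mean={mean:.4g}")
    return lo, hi, mean

# Example: if outputs and targets are identical batch after batch,
# suspect data leakage or a degenerate loss, not a "perfect" model.
outputs = [0.0, 0.0, 0.0]   # suspicious: everything collapsed to zero
targets = [0.0, 0.0, 0.0]
summarize("output", outputs)
summarize("target", targets)
```

Logging these statistics right before the loss call makes it obvious whether the loss function is being handed constant or identical tensors.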

Can you post the model architecture, and the training loop as well?