Hello, this is the first time I've implemented BCEWithLogitsLoss, and I'm wondering whether my input is correct, because I experienced a sudden spike in my classification loss during training. I read from Logit Explanation that the input should be in [-inf, inf], and I checked that my input looks something like this:
torch.Size([1, 4]) tensor([[-1376.6078, -2134.1909, -1130.7600, -517.1730]], device='cuda:0',
grad_fn=<ViewBackward>)
torch.Size([1, 4]) tensor([[-1015.1113, -2017.0060, -598.0123, -647.5376]], device='cuda:0',
grad_fn=<ViewBackward>)
torch.Size([1, 4]) tensor([[ -948.6944, -2063.7595, -120.7232, -31.2307]], device='cuda:0',
grad_fn=<ViewBackward>)
torch.Size([1, 4]) tensor([[-1494.8126, -2984.7998, -173.2264, -605.0916]], device='cuda:0',
grad_fn=<ViewBackward>)
torch.Size([1, 4]) tensor([[ -759.2620, -6767.8813, -6867.0396, 155.1411]], device='cuda:0',
grad_fn=<ViewBackward>)
torch.Size([1, 4]) tensor([[-1216.1967, -1960.8824, 781.1366, -871.1536]], device='cuda:0',
grad_fn=<ViewBackward>)
and my target looks something like this:
tensor([[0., 0., 1., 0.]], device='cuda:0', grad_fn=<FloorBackward>)
tensor([[0., 0., 0., 1.]], device='cuda:0', grad_fn=<FloorBackward>)
tensor([[0., 1., 0., 0.]], device='cuda:0', grad_fn=<FloorBackward>)
tensor([[0., 0., 0., 1.]], device='cuda:0', grad_fn=<FloorBackward>)
tensor([[0., 0., 0., 1.]], device='cuda:0', grad_fn=<FloorBackward>)
tensor([[0., 0., 0., 1.]], device='cuda:0', grad_fn=<FloorBackward>)
tensor([[0., 0., 1., 0.]], device='cuda:0', grad_fn=<FloorBackward>)
tensor([[0., 0., 0., 1.]], device='cuda:0', grad_fn=<FloorBackward>)
tensor([[0., 0., 1., 0.]], device='cuda:0', grad_fn=<FloorBackward>)
tensor([[0., 0., 0., 1.]], device='cuda:0', grad_fn=<FloorBackward>)
tensor([[0., 0., 1., 0.]], device='cuda:0', grad_fn=<FloorBackward>)
So is this the correct input format? If it is wrong, can you provide an example of what the input should look like? In case anyone is wondering, my last layer is just a simple Conv2d, and I then reshape its output to [1, 4] as shown above. Thank you
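For reference, here is a minimal sketch of how I'm calling the loss (the tensor values are copied from one of the printouts above; the variable names are just illustrative, not my actual code):

```python
import torch
import torch.nn as nn

criterion = nn.BCEWithLogitsLoss()

# Raw logits from the reshaped Conv2d output, shape [1, 4]
logits = torch.tensor([[-1216.1967, -1960.8824, 781.1366, -871.1536]])
# One-hot target, shape [1, 4]
target = torch.tensor([[0., 0., 1., 0.]])

loss = criterion(logits, target)
print(loss.item())
```

With logits of this magnitude the loss is essentially zero when the largest logit matches the target class, but it becomes enormous (on the order of the logit magnitude itself) whenever the prediction is wrong, which might explain the sudden spikes I'm seeing.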