Error about backward

Hi, I got the following error. Is there any clue as to what would cause 0 elements in grad_variables? Many thanks.

Traceback (most recent call last):
  File "", line 459, in <module>
  File "", line 178, in main
    train(args, net, optimizer, criterion, scheduler)
  File "", line 280, in train
  File "/home/rusu5516/.local/lib/python3.5/site-packages/torch/autograd/", line 167, in backward
    torch.autograd.backward(self, gradient, retain_graph, create_graph, retain_variables)
  File "/home/rusu5516/.local/lib/python3.5/site-packages/torch/autograd/", line 99, in backward
    variables, grad_variables, retain_graph)
RuntimeError: invalid argument 2: size '[1]' is invalid for input with 0 elements at /pytorch/torch/lib/TH/THStorage.c:41
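For context, one common way to end up with a 0-element input to backward() (a minimal sketch, not taken from the poster's code) is a loss computed from a tensor that has zero elements, e.g. after a boolean mask that matches nothing or an empty batch reaching the criterion. Checking `numel()` before calling `backward()` can confirm whether this is happening; the mask below is purely hypothetical:

```python
import torch

# Sketch: a selection that matches no elements yields a 0-element tensor.
# If such a tensor reaches the criterion, backward() receives an empty input.
x = torch.randn(4, requires_grad=True)
mask = x > 100            # hypothetical mask that selects nothing
selected = x[mask]        # 0-element tensor
print(selected.numel())   # prints 0

if selected.numel() == 0:
    print("loss input is empty; check batch sizes and masks upstream")
else:
    selected.sum().backward()
```

Adding a `numel()` assertion just before the `backward()` call in `train()` would narrow down where the empty tensor originates.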

This looks like a PyTorch bug. Could you post a script that reproduces it?

Were you able to solve the problem? I am stuck on the same issue.