January 20, 2017, 5:45pm
e.g., in Torch 7 I have a net whose last module is an nn.ConcatTable; I then make gradOutputs a table of tensors and call net:backward(inputs, gradOutputs).
How can I do something similar in PyTorch? I tried calling backward() for each output, but it complained that backward should not be called multiple times.
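A minimal sketch of the error described above (tensor names here are illustrative, not from the thread): calling backward() once per output fails on the second call, because the graph is freed after the first backward unless you pass retain_graph=True.

```python
import torch

# Two scalar outputs derived from the same input.
x = torch.randn(5, requires_grad=True)
y1 = (x * 2).sum()
y2 = (x ** 2).sum()

y1.backward(retain_graph=True)  # keep the graph alive for the next call
y2.backward()                   # would raise a RuntimeError without retain_graph above

# x.grad now accumulates the gradients of both outputs: 2 + 2*x
```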
January 20, 2017, 5:57pm
I think you want torch.autograd.backward.
It takes a list of variables and a list of gradients (one for each variable).
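A short sketch of that suggestion, using current PyTorch tensor syntax (in the 0.1.x versions discussed in this thread the arguments were Variables, but the call has the same shape; the two "heads" below are made up for illustration):

```python
import torch

# One input feeding two outputs, standing in for Torch 7's nn.ConcatTable.
x = torch.randn(4, 3, requires_grad=True)
head1 = x.sum(dim=1)          # first output, shape (4,)
head2 = (x ** 2).mean(dim=1)  # second output, shape (4,)

# One gradient tensor per output, each matching its output's shape.
grads = [torch.ones_like(head1), torch.ones_like(head2)]

# Backprop through both outputs in a single call.
torch.autograd.backward([head1, head2], grads)

# x.grad now holds the gradients accumulated from both heads.
```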
January 21, 2017, 6:00am
Yes it is what I want! Thanks!
Hope this can be added to the tutorials/examples, since this kind of backward() is common in multi-task learning. I had thought I needed to call
variable.backward() separately for each output, as in the
deep learning example.
January 21, 2017, 8:59am
Hi, the doc you pointed to has a minor problem: the second argument, grad_variables, should be "sequence of tensors", not "sequence of Variables".
See your code here:
January 21, 2017, 12:33pm
Thanks for the report, we’ll fix that!