In my model, I want to first forward some input and compute and store some intermediate results, then use these results in a second forward pass and update the model only with the gradient from the second pass. But I got this error:

```
Specify retain_graph=True when calling backward the first time.
```

My code is below. I toggle `requires_grad` before and after the first forward pass to get rid of this error, but this feels like an ugly hack. Is there a more elegant way to do this?
```python
# First pass: freeze parameters so no graph is built.
for p in net.parameters():
    p.requires_grad = False
net(input1)

# Second pass: re-enable gradients and backprop normally.
for p in net.parameters():
    p.requires_grad = True
output = net(input2)
loss = criterion(output, label)
loss.backward()
```
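One alternative I'm considering is wrapping the first pass in `torch.no_grad()` instead of toggling `requires_grad` on every parameter. Here is a minimal runnable sketch of that idea, with a toy linear net, dummy tensors, and a cross-entropy loss standing in for my actual model and data:

```python
import torch
import torch.nn as nn

# Toy stand-ins for my actual model, inputs, and loss.
net = nn.Linear(4, 2)
criterion = nn.CrossEntropyLoss()
input1 = torch.randn(3, 4)
input2 = torch.randn(3, 4)
label = torch.tensor([0, 1, 0])

# First pass: inside no_grad, no autograd graph is built,
# so intermediate results can be computed and stored
# without affecting the later backward call.
with torch.no_grad():
    net(input1)

# Second pass: normal forward/backward; only this pass
# contributes to the parameter gradients.
output = net(input2)
loss = criterion(output, label)
loss.backward()
```

This avoids mutating `requires_grad` on the parameters entirely, but I'm not sure whether it's fully equivalent for my use case.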