How to forward twice but backward only once, using the gradient from the second forward pass?

In my model, I want to first forward some input and compute and store some intermediate results, then use these results in a second forward pass and update the model only with the gradient calculated in that second pass. But I got this error: "Specify retain_graph=True when calling backward the first time." My code is below. I toggle requires_grad before and after the first forward pass to get rid of this error, but this feels like an ugly hack. Is there a more elegant way to do this?

    # Freeze parameters so the first pass builds no graph w.r.t. them
    for p in net.parameters():
        p.requires_grad = False
    net(input1)  # first pass: compute and store intermediate results

    # Re-enable gradients for the second pass
    for p in net.parameters():
        p.requires_grad = True
    output = net(input2)

    loss = criterion(output, label)
    loss.backward()  # backward only through the second forward pass

If the output of the first forward pass is not used in the second pass, couldn't you just switch the order? Or use volatile=True?
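
(Side note: volatile was a Variable flag in pre-0.4 PyTorch; in current PyTorch the closest equivalent is running the first pass under torch.no_grad(). A minimal sketch of that idea, assuming the same net, input1, input2, criterion, and label as in the original snippet:)

    import torch

    # Minimal sketch: run the first pass without building a graph,
    # so there is no retain_graph error and no requires_grad toggling.
    with torch.no_grad():
        net(input1)  # first pass: compute and store intermediate results only

    output = net(input2)             # second pass builds the graph as usual
    loss = criterion(output, label)
    loss.backward()                  # gradients come only from the second pass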

Thanks for the response. It's like the neural style transfer algorithm: some internal results need to be saved during the first forward pass.
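
One way to save such internal results is with a forward hook that stores detached activations during the first pass. This is only a sketch of the pattern being described, where layer_of_interest is a hypothetical submodule name:

    import torch

    # Sketch: capture an intermediate activation during the first pass
    # and keep it as a constant (detached) for later use.
    saved = {}

    def save_hook(module, inputs, output):
        saved["feat"] = output.detach()  # stored, but cut from the graph

    # `layer_of_interest` is a hypothetical submodule of `net`
    handle = net.layer_of_interest.register_forward_hook(save_hook)
    with torch.no_grad():
        net(input1)          # first pass: only fills `saved`
    handle.remove()

    output = net(input2)     # second pass: graph built normally
    loss = criterion(output, label)  # the loss may also use saved["feat"]
    loss.backward()          # gradients flow only through the second pass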

Do you need the results in the second pass? Do you need to backward through them?