Does volatile=True keep gradient intact?

My program flow is this:

model.train()
# ... training ...

model.eval()
x = Variable(input, volatile=True)
model(x)
# ... validation ...

optimizer.step()

Is the gradient still accumulated in inference mode when the volatile flag is set to True? From what I've understood, volatile=True means that the computations (and therefore the gradient) and the hidden states computed from that input will be discarded. Is that right?

EDIT: forget I asked this. :joy: I now recall that the gradient is only accumulated when I call backward(), and I reset it at the start of each epoch. You can delete this. :wink:

volatile=True tells PyTorch not to bother keeping track of the computation graph at all.
If any of your inputs has volatile=True, you will very likely find that calling .backward() on a result results in an error.
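As a side note: volatile (and the Variable wrapper itself) was removed in PyTorch 0.4, so on a current PyTorch this behavior is expressed with torch.no_grad() instead. A minimal sketch of the same idea, assuming a recent PyTorch:

```python
import torch

# "volatile=True" is gone in PyTorch >= 0.4; torch.no_grad() is the
# modern way to run inference without recording the computation graph.
x = torch.randn(3, requires_grad=True)

with torch.no_grad():
    y = x * 2  # no graph is recorded for this operation

print(y.requires_grad)  # False: y is detached from the graph

try:
    y.sum().backward()   # fails, just like backward() through a volatile input
except RuntimeError as e:
    print("backward failed:", e)
```

The no_grad context plays the same role the volatile flag used to: anything computed inside it has no grad_fn, so backpropagating through it raises a RuntimeError.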
