What is the equivalent of torch.no_grad() and model.eval() in torch 0.3?


I am working on someone's code written in torch 0.3. I realized torch 0.3 doesn't support torch.no_grad(). How can I make sure that no backpropagation is done while running validation on the dev or test set?


Use `Variable(..., volatile=True)` for the inputs. The volatile flag propagates to everything computed from that Variable, so no autograd graph is built.
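A minimal sketch of the torch 0.3-era evaluation pattern; the `Linear` model and the zero input are placeholders for your own model and dev/test batch. Note that `model.eval()` does exist in 0.3 (it switches dropout and batch-norm to inference mode); it is `torch.no_grad()` that was only added in 0.4, and `volatile=True` is its 0.3 counterpart:

```python
import torch
from torch.autograd import Variable

# Placeholder model standing in for whatever you are validating.
model = torch.nn.Linear(4, 2)

model.eval()  # inference mode for dropout / batch-norm (exists in 0.3)

# In torch 0.3, volatile=True tells autograd not to record a graph;
# the flag propagates to every output derived from this Variable,
# so no backpropagation is possible through the forward pass.
x = Variable(torch.zeros(1, 4), volatile=True)
y = model(x)
```

In torch >= 0.4, `volatile` is ignored with a warning, and the same code would be written with `with torch.no_grad():` around the forward pass.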
