Two small questions about "with torch.no_grad():"

Hi,

  1. You should actually wrap the whole validation code (anything that requires no backward pass) within this torch.no_grad() block. Note that you can also use it as a decorator on your eval function:
@torch.no_grad()
def val(args):
    # Your validation function
    return accuracy
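A minimal sketch of the decorator in action (the Linear layer and `evaluate` function are just placeholders for your real model and validation code):

```python
import torch

# Stand-in for a real model
model = torch.nn.Linear(4, 2)

@torch.no_grad()
def evaluate(x):
    # Gradient tracking is disabled for everything inside this function
    return model(x)

result = evaluate(torch.randn(3, 4))
print(result.requires_grad)  # False: no autograd graph was built
```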

1-bis In recent pytorch versions, Variable has been removed. You can remove all of them from your code. Tensors are the same as the old Variables: they have a .requires_grad attribute to tell whether they require gradients, and they accept a requires_grad keyword argument on creation.
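For example, the old Variable wrapper translates directly into plain tensor creation:

```python
import torch

# Old style (deprecated): Variable(torch.randn(3), requires_grad=True)
# New style: tensors carry requires_grad themselves
x = torch.randn(3, requires_grad=True)
print(x.requires_grad)  # True

y = torch.zeros(3)
print(y.requires_grad)  # False by default
y.requires_grad_()      # can also be switched on in place
print(y.requires_grad)  # True
```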

2. Yes, this is exactly how you should do it:

with torch.no_grad():
    # No gradients in this block
    x = self.cnn(x)

# Gradients as usual outside of it
x = self.lstm(x)
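You can check the split yourself with a small sketch (the two Linear layers below are hypothetical stand-ins for the CNN and LSTM parts):

```python
import torch

cnn_like = torch.nn.Linear(10, 8)   # stand-in for self.cnn
lstm_like = torch.nn.Linear(8, 4)   # stand-in for self.lstm

x = torch.randn(5, 10)
with torch.no_grad():
    # No graph is built here, so the CNN part is excluded from backward
    feats = cnn_like(x)

# Outside the block, autograd tracks operations as usual
out = lstm_like(feats)
print(feats.requires_grad)  # False
print(out.requires_grad)    # True: lstm_like's parameters require grad
```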