Help with `with torch.set_grad_enabled(True):`

I’m transitioning to 0.4 and I love the new syntax!

I have a bit of a problem understanding how to use `with torch.set_grad_enabled(True):` to initialize tensors with grad enabled:

In [31]: with torch.set_grad_enabled(True):
    ...:   X = torch.tensor(10)
    ...:   y = X * 10
    ...:   

In [32]: X.requires_grad
Out[32]: False

In [33]: y.requires_grad
Out[33]: False

Shouldn’t both be True?


I think the `with` block only controls whether operations are tracked by autograd inside it — it doesn't set `requires_grad` on newly created tensors, which default to `requires_grad=False`.
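A minimal sketch of what the context manager does control — whether operations on a tensor that already requires grad get tracked:

```python
import torch

x = torch.tensor(10.0, requires_grad=True)  # float leaf tensor that tracks grad

with torch.set_grad_enabled(False):
    y = x * 10          # tracking disabled inside the block
print(y.requires_grad)  # False

z = x * 10              # tracking is back on outside the block
print(z.requires_grad)  # True
```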

You can set requires_grad for a tensor with

X.requires_grad = True

or

X.requires_grad_()
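Putting that together, a small sketch — note the float dtype, since only floating-point tensors can require gradients, and the original `torch.tensor(10)` is an integer tensor:

```python
import torch

X = torch.tensor(10.0)  # float dtype; integer tensors cannot require grad
X.requires_grad_()      # flips requires_grad in place and returns the tensor
y = X * 10              # this op is now tracked by autograd

print(X.requires_grad)  # True
print(y.requires_grad)  # True

y.backward()
print(X.grad)           # tensor(10.)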


Ah ok … makes sense - thanks!

If you care for a bit of unsolicited advice: note that X.reqires_grad = True will silently not do what you expect, while X.reqires_grad_() will tell you about your spelling. For that reason (and for writing y = x.detach().requires_grad_() and similar more compactly) I have moved to using the method exclusively.
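A quick sketch of the failure mode described above — the misspelled attribute assignment is silent, while the misspelled method call fails loudly:

```python
import torch

x = torch.tensor(10.0)

x.reqires_grad = True       # typo: silently creates a new Python attribute
print(x.requires_grad)      # False -- the real flag is untouched

try:
    x.reqires_grad_()       # typo on the method name raises immediately
except AttributeError as err:
    print("caught:", err)

y = x.detach().requires_grad_()  # the compact detach-and-track pattern
print(y.requires_grad)           # True
```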

Best regards

Thomas
