@SimonW I'm getting the error "AttributeError: 'Variable' object has no attribute 'as_strided'" although I have the latest PyTorch version. What is the problem?
Oh I see… Yeah, it might not be available in 0.3.1. Before the next release, you can try advanced indexing: A[[1,2,3], [1,2,3]] = 4. But that might break the backward pass if some part of the graph depends on the original, overwritten values. Or you can multiply by a mask matrix and then add another matrix… Yeah, neither of these is ideal…
The solution with advanced indexing is the way to go for 0.3.1, I think.
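For completeness, the "multiply by a matrix then add another matrix" alternative mentioned above can be sketched with a mask built from torch.eye. This is just a sketch (variable names are mine, and on modern PyTorch the Variable wrapper is a no-op):

```python
import torch
from torch.autograd import Variable

size = 4
full = Variable(torch.rand(size, size), requires_grad=True)
new_diag = Variable(torch.rand(size), requires_grad=True)

eye = Variable(torch.eye(size))
# Zero out the diagonal with a mask, then add the new diagonal values.
# This is all out-of-place, so autograd tracks everything.
result = full * (1 - eye) + torch.diag(new_diag)
result.sum().backward()
# full.grad is 0 on the diagonal and 1 elsewhere;
# new_diag.grad is full of 1.
```

It is wasteful (two extra full-size temporaries) but avoids in-place writes entirely.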
Keep in mind that in-place operations are not always possible when working with Variables, because the original value might be needed to compute the backward pass.
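You can see autograd complain about this directly. In the sketch below (my own minimal example, not from the thread), exp()'s backward reuses its output, so modifying that output in-place makes the saved value stale and backward raises a RuntimeError:

```python
import torch
from torch.autograd import Variable

x = Variable(torch.rand(3), requires_grad=True)
y = x.exp()      # backward of exp() reuses its output y
y[0] = 0.0       # in-place edit bumps y's version counter
failed = False
try:
    y.sum().backward()
except RuntimeError:
    # autograd detects that a value needed for the
    # backward pass was modified in-place
    failed = True
```

If the operation's backward does not need the value (e.g. sum()), the same in-place edit goes through fine.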
It will affect the gradients because the diagonal values of the original matrix are no longer used to compute the output.
See the code sample below:
import torch
from torch.autograd import Variable
size = 10
full = Variable(torch.rand(size, size), requires_grad=True)
new_diag = Variable(torch.rand(size), requires_grad=True)
# Do this because we cannot change a leaf variable inplace
full_clone = full.clone()
full_clone[range(size), range(size)] = new_diag
full_clone.sum().backward()
# Should be 0 on the diagonal and 1 everywhere else
print(full.grad)
# Should be full of 1
print(new_diag.grad)
Ok, not sure that makes sense…
But here is the code to do it:
import torch
from torch.autograd import Variable
size = 10
full = Variable(torch.rand(size, size), requires_grad=True)
new_diag = Variable(torch.rand(size), requires_grad=True)
# Do this because we cannot change a leaf variable inplace
full_clone = full.clone()
# WARNING: using .data here bypasses the graph, so this
# operation will not be tracked by the autograd engine,
# hence giving "wrong" gradients
full_clone.data[range(size), range(size)] = new_diag.data
full_clone.sum().backward()
# Should be full of 1
print(full.grad)
# Should be None (equivalent to full of 0)
print(new_diag.grad)
If the backward pass doesn't need the contents of that cov matrix you have, then just modifying it in-place is fine. (Run .backward() to find out: autograd will raise an error if it needed the original values.) Otherwise, clone it first and then modify the clone in-place.
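A quick sketch of that clone-then-modify pattern (my own example; the cov matrix here is just a random placeholder): the clone is a tracked, non-leaf copy, so an in-place write to it before it is used keeps the graph intact and gradients correct.

```python
import torch
from torch.autograd import Variable

cov = Variable(torch.rand(3, 3), requires_grad=True)
safe = cov.clone()      # tracked copy; safe to modify in-place
safe[0, 0] = 0.0        # overwrite one entry before using safe
(safe * safe).sum().backward()
# cov.grad[0, 0] is 0 (the overwritten entry no longer
# contributes); elsewhere cov.grad == 2 * cov
```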