Why does the following code raise a RuntimeError?
from torch.autograd import Variable
import torch
print(torch.__version__)
# 0.2.0_1
x = torch.rand(4, 6)
x = Variable(x)
y = torch.rand(4 * 6)
y = Variable(y)
mask = x > 0.5

def fun(y, mask):
    y = y.view(4, 6)
    y[mask] = -1.0

fun(y, mask)
The error is:
RuntimeError: in-place operations can be only used on variables that don't share storage with any other variables, but detected that there are 2 objects sharing it
But when I don't reshape y inside fun, there is no error. There is also no error if I use y = y.view(4, 6).clone() instead of the plain view.
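For context, a small sketch of the storage-sharing difference between view and clone, which seems to be what the error is about (written against the current tensor API, where Variable has been merged into Tensor; data_ptr() is used here only to inspect which buffer a tensor points at):

```python
import torch

y = torch.rand(4 * 6)

v = y.view(4, 6)          # a view: reinterprets y's existing storage
c = y.view(4, 6).clone()  # a clone: copies the data into new storage

# The view points at the same underlying buffer as y,
# while the clone owns an independent one.
print(v.data_ptr() == y.data_ptr())  # True
print(c.data_ptr() == y.data_ptr())  # False

# Writing through the clone therefore cannot touch y's storage.
mask = torch.rand(4, 6) > 0.5
c[mask] = -1.0
```

This would be consistent with the observed behavior: the in-place masked assignment is rejected when the target shares storage with another variable (the view), but allowed once clone() breaks that sharing.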