Assignment using a byte mask

Why does the following code:

import torch
from torch.autograd import Variable

print(torch.__version__)
# 0.2.0_1

x = torch.rand(4, 6)
x = Variable(x)
y = torch.rand(4*6)   # flat tensor with the same number of elements as x
y = Variable(y)

mask = x > 0.5        # byte mask with shape (4, 6)

def fun(y, mask):
    y = y.view(4, 6)  # reshape the flat tensor to the mask's shape
    y[mask] = -1.0    # in-place masked assignment on the view

fun(y, mask)

give this error:

RuntimeError: in-place operations can be only used on variables that don't share storage with any other variables, but detected that there are 2 objects sharing it

But when I don’t reshape y inside fun, there is no error.
There is also no error if I use y = y.view(4, 6).clone().

Try the following:

import torch
from torch.autograd import Variable

print(torch.__version__)
# 0.2.0_1

x = torch.rand(4, 6)
x = Variable(x)
y = torch.rand(4*6)
y = Variable(y)

mask = x > 0.5

def fun(y, mask):
    y = y.view(4, 6).clone()  # clone() gives the view its own storage
    y[mask] = -1.0            # so the in-place assignment is now allowed
    return y

y = fun(y, mask)

The problem is that you can’t (yet) perform in-place operations on a view. My snippet above clones the view so that it is its own tensor with its own storage; the in-place assignment then modifies the clone, which is returned, while the original y is left unchanged.
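If you want to skip the extra clone, an out-of-place masked fill should also work, since nothing is modified in place and the shared-storage check never fires. This is only a rough sketch: I’m assuming masked_fill (the out-of-place variant of masked_fill_) is available on Variables in your version, and fun_masked_fill / y_filled are just illustrative names.

def fun_masked_fill(y, mask):
    # Out-of-place: builds a new tensor with -1.0 wherever the mask is
    # nonzero, so no view is written to in place.
    return y.view(4, 6).masked_fill(mask, -1.0)

y_filled = fun_masked_fill(y, mask)

As with the clone-based version, the original 1-D y is left untouched and you work with the returned tensor.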
