import torch
import torch.nn.functional as F

x = torch.randn(10, 20)
mask = torch.bernoulli(torch.full(size=(10,), fill_value=0.50)).bool()
i = 0
# F.one_hot expects a tensor, not a plain int
oh = F.one_hot(torch.tensor(1), num_classes=20).float()
When I do
x[mask][i] = oh
I’m hoping that one of the rows of x is replaced by oh, but that’s not the case: the change is not kept. Why? Do you have an alternative so that it works?
I found a way to make it work but it’s not pretty:
Just adding a note to @ptrblck’s answer: I think x[mask] is a form of advanced indexing, and according to the PyTorch documentation this kind of indexing returns a copy of the tensor rather than a view of the underlying tensor.
When accessing the contents of a tensor via indexing, PyTorch follows NumPy’s behavior: basic indexing (integers and slices) returns views, while advanced indexing (boolean masks and index tensors) returns a copy.
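A quick sketch illustrating the difference — `data_ptr()` shows whether two tensors share the same underlying storage:

```python
import torch

x = torch.randn(10, 20)

# basic (slice) indexing returns a view sharing x's storage
view = x[0:5]
assert view.data_ptr() == x.data_ptr()

# advanced (boolean-mask) indexing returns a copy with its own storage
mask = torch.zeros(10, dtype=torch.bool)
mask[0] = True
copy = x[mask]
assert copy.data_ptr() != x.data_ptr()

# writing into the view is visible in x; writing into the copy is not
view[0, 0] = 1.0
assert x[0, 0] == 1.0
copy[0, 0] = 2.0
assert x[0, 0] == 1.0  # x unchanged by the write into the copy
```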
Thus x[mask][i] = oh writes into that temporary copy rather than into the underlying tensor, so x is left unmodified.
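One way to get the behavior the question asks for is to turn the mask into explicit row indices and assign through a single (not chained) indexing expression — assignment via one indexing expression mutates x in place rather than producing an intermediate copy. A minimal sketch:

```python
import torch
import torch.nn.functional as F

x = torch.randn(10, 20)
mask = torch.bernoulli(torch.full(size=(10,), fill_value=0.50)).bool()
oh = F.one_hot(torch.tensor(1), num_classes=20).float()

# positions of the rows selected by the mask
rows = mask.nonzero(as_tuple=True)[0]
i = 0
if rows.numel() > 0:
    # one indexing step on the left-hand side: this writes into x directly
    x[rows[i]] = oh
    assert torch.equal(x[rows[i]], oh)
```

Note also that a single masked assignment such as `x[mask] = oh` does modify x (it broadcasts oh into every selected row), because the copy is only made when an advanced-indexing *read* happens first, as in the chained `x[mask][i] = oh`.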