and rebinds the name in each iteration, which can (and in your case does) reuse the same data_ptr. If you print item.data_ptr() for x and y, or for item before the contiguous() call, you will see that they are different.
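This rebinding behaviour is plain Python semantics rather than anything PyTorch-specific; a minimal sketch using id() as a stand-in for data_ptr() (no tensors involved, bytearray stands in for a tensor):

```python
# Rebinding a loop variable never mutates the original objects; it only
# points the local name at a new object, just like item = item.contiguous().
items = [bytearray(b"abc"), bytearray(b"xyz")]
for item in items:
    old_id = id(item)
    item = bytearray(item)      # returns a NEW object, like contiguous()
    assert id(item) != old_id   # the name now refers to the fresh object
print(items[0] == bytearray(b"abc"))  # True: the list entries are untouched
```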
Thanks! I have another question: how can I make some tensors contiguous in place? For example:
import torch

def set_contiguous(args_list: list):
    for i in range(len(args_list)):
        args_list[i] = args_list[i].contiguous()

device = 'cuda:0'
x = torch.rand([2, 3, 4], device=device).transpose(0, 2)
y = torch.zeros_like(x)
args_list = [x, y]
set_contiguous(args_list)
print(x.is_contiguous())
The output is: False
As you said, args_list[i] = args_list[i].contiguous() creates a new tensor and only rebinds the list element, so x and y themselves never become contiguous. How can I define a correct set_contiguous function?
Thanks! It seems hard to find a truly in-place function that makes x and y contiguous. For now I return the new tensors and rebind the inputs to them, which behaves like an in-place update:
import torch

def set_contiguous(*args):
    ret = []
    for item in args:
        ret.append(item.contiguous())
    return ret

device = 'cuda:0'
x = torch.rand([2, 3, 4], device=device).transpose(0, 2)
y = torch.zeros_like(x)
x, y = set_contiguous(x, y)
print(x.is_contiguous())
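If you really do need to mutate the tensors themselves rather than rebind the names, one option (a sketch, not from the thread above; set_contiguous_ is a hypothetical helper) is Tensor.set_, which re-points a tensor at new storage in place. Note that other tensors that were views of x keep the old storage:

```python
import torch

# Tensor.set_(source) rewires the tensor to use source's storage, size and
# strides, so the change is visible through every name bound to that tensor.
def set_contiguous_(*args):
    for t in args:
        if not t.is_contiguous():
            t.set_(t.contiguous())  # in-place swap of the underlying storage

x = torch.rand([2, 3, 4]).transpose(0, 2)  # CPU here, for a minimal example
y = torch.zeros_like(x)                    # preserves x's non-contiguous layout
set_contiguous_(x, y)
print(x.is_contiguous(), y.is_contiguous())  # True True
```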