# View tensor as larger tensor without copying memory

Hi all,

I have 2 tensors. They are quite large, so for the sake of simplicity I will give a simplified example. The small tensor is half the size of the large tensor in every dimension: (2, 2) vs. (4, 4).

```
small = [[1, 2], [3, 4]]
large = [[1, 2, 3, 4], [5, 6, 7, 8], [9, 10, 11, 12], [13, 14, 15, 16]]
```

I need to do a comparison between the two. For this comparison, each item in the small tensor needs to be repeated so that it fills double the space in each dimension; it is basically nearest-neighbour upsampling. The catch is that, if possible, I do not want to copy the memory, since these are really big tensors.

```
desired_small_tensor = [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]
```
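For what it's worth, broadcasting can express this comparison without ever materialising the upsampled tensor: insert singleton dimensions into the small tensor and view the large one as 2×2 blocks, and the elements line up via stride-0 broadcast views, so no memory is copied. A minimal sketch (the variable names and the `torch.arange` test data are mine):

```python
import torch

small = torch.tensor([[1, 2], [3, 4]])
large = torch.arange(1, 17).reshape(4, 4)

# View large as 2x2 blocks: blocks[i, a, j, b] == large[2*i + a, 2*j + b].
# reshape on a contiguous tensor returns a view, not a copy.
blocks = large.reshape(2, 2, 2, 2)

# small[:, None, :, None] has shape (2, 1, 2, 1) and broadcasts against
# blocks with stride-0 views, so the "upsampled" small tensor is never
# allocated; only the boolean result is.
out = (small[:, None, :, None] < blocks).reshape(4, 4)
```

The comparison result `out` is exactly `desired_small_tensor < large`, computed without building `desired_small_tensor`.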

If you are not willing to upsample the small tensor, you could instead slice the large tensor to match the small one. Here is some sample code:

```
import torch

# small and large are assumed to already be torch tensors here
sizes = small.shape
out = torch.zeros_like(large, dtype=torch.uint8)

# Walk over the large tensor in small-sized blocks; narrow() returns
# views, so no data is copied. Each 2x2 block of large corresponds to a
# single (repeated) element of small, so compare it against small[i, j].
for i in range(2):
    large_dim0 = large.narrow(0, i * sizes[0], sizes[0])
    out_dim0 = out.narrow(0, i * sizes[0], sizes[0])
    for j in range(2):
        large_dim1 = large_dim0.narrow(1, j * sizes[1], sizes[1])
        out_dim1 = out_dim0.narrow(1, j * sizes[1], sizes[1])

        out_dim1[:] = small[i, j] < large_dim1
```

Thanks @Joel_Wu
I think the slicing will probably be slow, since each dimension has > 1000 elements. I'll keep looking for a solution, but for now I'm upsampling the small one.
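For the record, the explicit upsample I'm doing can be written with `repeat_interleave`, which does allocate a new tensor (unlike a view-based approach). A minimal sketch:

```python
import torch

small = torch.tensor([[1, 2], [3, 4]])

# repeat_interleave repeats each element along a dimension, which is
# exactly nearest-neighbour upsampling by a factor of 2 in that
# dimension; note that this copies memory.
up = small.repeat_interleave(2, dim=0).repeat_interleave(2, dim=1)
```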