Running out of memory when subtracting medium-sized tensors?

I’m trying to subtract two tensors, which I don’t think should be prohibitively large…

uv.shape = torch.Size([256, 256, 2])
xy.shape = torch.Size([40, 28672, 3])

When I try to perform a simple subtraction, my kernel restarts by itself. I'm guessing it's a memory issue, but I don't see why my computer can't handle it, as I have 128GB RAM. What could be triggering this problem?

uv[None, None, :, :, 0] - xy[:, :, None, None, 0]

Based on the posted operation, your output would have the shape [40, 28672, 256, 256], i.e. 75,161,927,680 elements.
If you are using float32 (the default dtype), this would need 40 * 28672 * 256 * 256 * 4 / 1024**3 = 280GB.
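You can verify this estimate with plain Python arithmetic, using the shapes from the post:

```python
# Broadcast result of uv[None, None, :, :, 0] - xy[:, :, None, None, 0]:
# [1, 1, 256, 256] broadcast against [40, 28672, 1, 1]
out_shape = (40, 28672, 256, 256)

n_elements = 1
for dim in out_shape:
    n_elements *= dim

bytes_per_float32 = 4  # torch.float32 is 4 bytes per element
size_gb = n_elements * bytes_per_float32 / 1024**3

print(n_elements)  # 75161927680
print(size_gb)     # 280.0
```

So the subtraction itself is cheap; it is the broadcast output that cannot fit in 128GB of RAM.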

Thanks, that makes sense. Do you have any suggestions on how to perform simple math operations on large tensors?

Your code would work assuming your system has enough memory.
If that's not the case, you might need to reduce the input shapes or check whether this output shape is even expected.
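If you do need the full result but not all of it in memory at once, one common workaround is to compute it in chunks along one of the large dimensions and consume (or store) each chunk before moving on. A minimal sketch of the idea, using NumPy (which follows the same broadcasting rules as PyTorch) and deliberately small stand-in shapes; `chunk_size` and the shapes here are placeholders you would tune to your real data and memory budget:

```python
import numpy as np

# Small stand-ins for the real tensors (uv: [256, 256, 2], xy: [40, 28672, 3]).
uv = np.random.rand(8, 8, 2).astype(np.float32)
xy = np.random.rand(4, 16, 3).astype(np.float32)

chunk_size = 4  # how many rows of xy's second dimension to process per step

results = []
for start in range(0, xy.shape[1], chunk_size):
    chunk = xy[:, start:start + chunk_size, None, None, 0]
    # Only a [4, chunk_size, 8, 8] slice is materialized per iteration;
    # with the real shapes that would be [40, chunk_size, 256, 256].
    results.append(uv[None, None, :, :, 0] - chunk)

# Concatenating here rebuilds the full result, which only makes sense if it
# fits in memory; otherwise write each chunk to disk or reduce it immediately.
out = np.concatenate(results, axis=1)
print(out.shape)  # (4, 16, 8, 8)
```

The same loop works with `torch` slicing in place of NumPy; the key point is that each iteration allocates only a `chunk_size`-sized slice of the broadcast output instead of the whole thing.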