Torch.mul RuntimeError: CUDA out of memory

Hi,
v: torch.Size([16, 1, 1024, 32768])
vp: torch.Size([1, 16, 1024, 32768])
c = torch.mul(v, vp) raises RuntimeError: CUDA out of memory.
How can I fix this error?

thanks

Each of these Tensors is quite big.
When you call torch.mul(), the two inputs broadcast to shape [16, 16, 1024, 32768], and a new Tensor of that size is allocated on the GPU to hold the result, which exceeds your GPU's memory. I suspect that merely creating a Tensor of the result's size will fail the same way: res = torch.empty(16, 16, 1024, 32768).
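For reference, assuming float32 (4 bytes per element), the result alone would take about 32 GiB:

```python
# Memory needed just for the broadcasted [16, 16, 1024, 32768] result,
# assuming float32 (4 bytes per element)
numel = 16 * 16 * 1024 * 32768        # 8,589,934,592 elements
print(numel * 4 / 1024**3, "GiB")     # -> 32.0 GiB
```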

Hi @albanD,
I get the same error for res = torch.empty(16, 16, 1024, 32768) as for res = torch.mul(v, vp).

So the issue is just that you don't have enough GPU memory to store that Tensor. I'm afraid you will need to find a way to avoid materializing the full Tensor, work with a smaller one, or buy a GPU with more memory.
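If the downstream computation only needs a reduction over the product (or the full result can live in host RAM instead), a common workaround is to process the broadcasted product in chunks so that only a small slice is ever materialized on the GPU. Here is a minimal sketch, assuming a sum over the last dimension stands in for whatever you actually do with c, and with an arbitrary chunk size you would tune to your GPU:

```python
import torch

# Stand-ins with the shapes from the original post (~2 GiB each in float32)
v = torch.randn(16, 1, 1024, 32768, device="cuda")
vp = torch.randn(1, 16, 1024, 32768, device="cuda")

chunk = 1024                                    # tune to the available GPU memory
out = torch.zeros(16, 16, 1024, device="cuda")  # reduced result, much smaller

for start in range(0, v.shape[-1], chunk):
    sl = slice(start, start + chunk)
    # Only a [16, 16, 1024, chunk] slice (~1 GiB) of the product exists at a time
    out += (v[..., sl] * vp[..., sl]).sum(dim=-1)
```

If you really need every element of the full [16, 16, 1024, 32768] product, you could instead copy each chunk of the product into a preallocated CPU tensor, but that still requires roughly 32 GiB of host RAM.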