a_d
1
Hello all,
I just got started with complex tensors and tried creating a complex tensor of shape (2,2), and it runs out of memory. My code is -
import torch
import torch.nn as nn
a = torch.rand(size=(2,2), dtype=torch.complex32)
print(a)
and the output error is -
Traceback (most recent call last):
File "/home/atharva/PycharmProjects/DeepComplexNetwork/main.py", line 4, in <module>
a = torch.rand(size=(2,2), dtype=torch.complex32)
RuntimeError: [enforce fail at CPUAllocator.cpp:64] . DefaultCPUAllocator: can't allocate memory: you tried to allocate 34359738368 bytes. Error code 12 (Cannot allocate memory)
I was just wondering: is a complex32 tensor, even one of size (2,2), really 34359738368 bytes big?
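For context, here is a quick back-of-the-envelope check (my own arithmetic, assuming complex32 stores two float16 values, i.e. 4 bytes per element):
# A (2, 2) complex32 tensor should need roughly 2 * 2 * 4 = 16 bytes,
# while the allocator asked for 2**35 bytes (32 GiB).
print(2 * 2 * 4)   # 16
print(2 ** 35)     # 34359738368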
TIA…
a_d
2
I just realized that it works with complex128 and complex64 but not with complex32… are complex32 tensors bigger?
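For anyone comparing, here is a minimal sketch of what does work (assuming PyTorch 1.6, where the complex dtypes are still in beta):
import torch
# complex64 and complex128 allocate normally; only complex32 fails.
a64 = torch.rand(size=(2, 2), dtype=torch.complex64)
a128 = torch.rand(size=(2, 2), dtype=torch.complex128)
# 8 bytes per complex64 element and 16 per complex128 element,
# so these (2, 2) tensors take only 32 and 64 bytes respectively.
print(a64.element_size() * a64.nelement())    # 32
print(a128.element_size() * a128.nelement())  # 64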
albanD
(Alban D)
3
Hi,
Thanks for the report! This definitely looks like a bug: https://github.com/pytorch/pytorch/issues/43143
Note that complex32 is most likely not ready for general use.
Also, could you share your PyTorch version and how you installed it on the issue, please?
a_d
4
Hi
Sorry for the late response… the PyTorch version is 1.6.0, and it was installed simply through pip3 - pip3 install torch torchvision
albanD
(Alban D)
5
Hi,
Yes, we were able to reproduce it, and it should be fixed now if you use the nightly build.
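For reference, a nightly build can typically be installed from the pre-release pip wheels; the exact command depends on your platform and CUDA version, so treat this as a sketch and check pytorch.org for the current instructions:
pip3 install --pre torch torchvision -f https://download.pytorch.org/whl/nightly/cpu/torch_nightly.html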