I am trying to use complex-valued data as input to a test neural network. According to the release notes, PyTorch 1.8.0 supports complex autograd. My code is as follows.
import torch
from torch import nn, optim

class ComplexTest(nn.Module):
    def __init__(self):
        super(ComplexTest, self).__init__()
        self.fc1 = nn.Linear(10, 20)
        self.fc2 = nn.Linear(20, 10)
        self.relu = nn.ReLU()

    def forward(self, inputs):
        return self.fc2(self.relu(self.fc1(inputs)))

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
complex_test = ComplexTest().to(device)
complex_test.train()
opt = optim.Adam(complex_test.parameters())
mse_loss = nn.MSELoss()

for _ in range(100):
    opt.zero_grad()
    inp = torch.randn((1000, 10), dtype=torch.cfloat).to(device)
    op = complex_test(inp)
    loss = mse_loss(op, inp)
    loss.backward()
    opt.step()
    print(loss.item())
But this gives the error
expected scalar type Float but found ComplexFloat
Is this not supported or did I read the documentation wrong? Thanks!
@omarfoq, thank you for your response! I tried adding .to(torch.cfloat) as you suggested, and my code now looks like this.
import torch
from torch import nn, optim

class ComplexTest(nn.Module):
    def __init__(self):
        super(ComplexTest, self).__init__()
        self.fc1 = nn.Linear(10, 20).to(torch.cfloat)
        self.fc2 = nn.Linear(20, 10).to(torch.cfloat)
        self.relu = nn.ReLU()

    def forward(self, inputs):
        return self.fc2(self.relu(self.fc1(inputs)))

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
complex_test = ComplexTest().to(device)
complex_test.train()
opt = optim.Adam(complex_test.parameters())
mse_loss = nn.MSELoss()

for _ in range(100):
    opt.zero_grad()
    inp = torch.randn((1000, 10), dtype=torch.cfloat).to(device)
    op = complex_test(inp)
    loss = mse_loss(op, inp)
    loss.backward()
    opt.step()
    print(loss.item())
Now the layers no longer have a problem with the complex-valued input, but I could not figure out how to do the same for the activation function. The following error pops up instead.
"threshold_cpu" not implemented for 'ComplexFloat'
ReLU is in fact a function based on comparing a real number with 0. Since there is no natural ordering of the complex numbers, taking the max of a complex number and zero is not well defined, and so neither is ReLU for complex inputs.

Depending on the problem you are working on, you can define your own activation, or use activations that are adapted to complex numbers (this includes all holomorphic functions).
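For example, one simple choice (sometimes called CReLU or "split" ReLU in the complex-valued network literature) applies ReLU independently to the real and imaginary parts. A minimal sketch, assuming you just want a drop-in replacement for nn.ReLU() in the model above:

```python
import torch
from torch import nn

class CReLU(nn.Module):
    """Split complex activation: ReLU applied independently to
    the real and imaginary parts of the input.

    This is one common choice for complex-valued networks, not
    the only option (modReLU, zReLU, etc. also exist).
    """
    def forward(self, z: torch.Tensor) -> torch.Tensor:
        # z.real / z.imag are real-valued views of a complex tensor;
        # torch.complex recombines them into a cfloat tensor.
        return torch.complex(torch.relu(z.real), torch.relu(z.imag))

act = CReLU()
z = torch.tensor([1.0 - 2.0j, -3.0 + 4.0j])
print(act(z))  # each part is clamped at zero independently
```

Note that nn.MSELoss may also reject complex tensors depending on the PyTorch version; a manual squared-error loss such as (op - inp).abs().pow(2).mean() produces a real-valued scalar that backpropagates through complex outputs.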