Complex numbers and backward()

I have complex-valued data which I would like to use to train a network in PyTorch. I have managed to create the net and all the tensors.

However, as soon as I start training, I get the following error: grad can be implicitly created only for real scalar outputs but got torch.complex64

This happens during the call to loss.backward().

I wonder, is it possible to use PyTorch to perform training with complex data? If so, does anybody have experience with what has to be adjusted, or how to reformulate the problem?

Judging by your error, your loss is probably not real-valued, so this may be an issue with your loss function. At the end of the day, you need to produce a real scalar value for the loss in order to do backprop. This is the case whether you are working with real numbers, complex numbers, quaternions, etc. So you may need to look at your loss function and consider how to arrange it such that you get a real-valued loss output.
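
To see the failure mode, here is a minimal sketch (the loss computation is just illustrative, assuming yours similarly ends up complex):

import torch

x = torch.randn(3, dtype=torch.cfloat, requires_grad=True)
loss = (x ** 2).mean()  # mean of a complex tensor is still complex
loss.backward()         # raises: grad can be implicitly created only for real scalar outputs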

Below is just one example that works:

import torch
import torch.nn as nn

# Complex-valued inputs and a complex-valued linear layer
dummy_inputs = torch.rand((10, 15), dtype=torch.cfloat)

model = nn.Linear(15, 1, dtype=torch.cfloat)
optimizer = torch.optim.SGD(model.parameters(), lr=0.001)

def criterion(x, y):
    # dist is still complex-valued at this point
    dist = torch.sqrt(x**2 + y**2)
    # the magnitude of the mean is the real scalar that backward() requires
    return torch.abs(torch.mean(dist))

optimizer.zero_grad()
outputs = model(dummy_inputs)
loss = criterion(outputs, torch.rand_like(outputs))
loss.backward()
optimizer.step()

By no means am I saying the above loss function is the ideal one for your case. You may wish to read some papers on the subject to determine what loss functions have been proposed in the past with some success, and then tailor the loss function around your needs.
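
For instance, one common formulation is a mean-squared error on the complex difference, which is real-valued by construction (a sketch only; complex_mse is a name I made up, and it may not suit your data):

import torch

def complex_mse(pred, target):
    # |pred - target|**2 is real element-wise, so the mean is a real scalar
    return torch.mean(torch.abs(pred - target) ** 2)

pred = torch.rand((10, 1), dtype=torch.cfloat, requires_grad=True)
target = torch.rand_like(pred)
loss = complex_mse(pred, target)
loss.backward()  # works: loss is a real scalar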

Thanks! That was indeed the problem.
