nn.Linear as forward function in a complex-valued neural network

Hey guys!

I am running into a problem when using PyTorch’s complex tensor support. I am trying to write a little training program in which I use torch.nn.Linear as the forward function. The input data I am working with, however, can be complex.
Therefore, inside torch.nn.Linear I end up trying to do a matmul with complex-valued tensors, which is not allowed (under release 1.6.0). The reason I end up operating on complex-valued tensors is that my cost function is complex, which leads to complex gradients and hence possibly complex weights.
Can someone tell me whether there is another forward function I could use that would not run into these problems? Or can someone give me some insight into another way to write a little neural network with a complex cost function?
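For reference, this is roughly the situation I mean (hypothetical shapes; the exact error text depends on the build):

```python
import torch
import torch.nn as nn

layer = nn.Linear(4, 3)                    # real-valued weights
x = torch.randn(2, 4, dtype=torch.cfloat)  # complex input batch

# Under 1.6.0 the matmul inside nn.Linear rejects complex tensors,
# so this line raises an error:
# y = layer(x)
```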


If you don’t mind me asking, what does it mean to have a complex cost function (i.e. how do you “minimize” a complex value)?
One obvious way could be to feed the real and imaginary parts into the network separately; it would seem PyTorch still needs a bit more time before supporting complex for everything. Matmul should work with the preview nightly but not Linear, so you could define your own linear if you want.
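For example, here is a minimal sketch of such a hand-rolled linear layer that keeps the real and imaginary parts as separate real tensors, so it only ever needs real matmuls and should run on 1.6.0 as well (ComplexLinear and the parameter names are just illustrative, not a PyTorch API):

```python
import torch
import torch.nn as nn

class ComplexLinear(nn.Module):
    # Illustrative complex linear layer: the weight W = W_re + i*W_im and
    # bias b = b_re + i*b_im are stored as real tensors, so the forward
    # pass only uses real-valued matmuls.
    def __init__(self, in_features, out_features):
        super().__init__()
        self.w_re = nn.Linear(in_features, out_features, bias=False)
        self.w_im = nn.Linear(in_features, out_features, bias=False)
        self.b_re = nn.Parameter(torch.zeros(out_features))
        self.b_im = nn.Parameter(torch.zeros(out_features))

    def forward(self, x_re, x_im):
        # (W_re + i W_im)(x_re + i x_im) + (b_re + i b_im)
        out_re = self.w_re(x_re) - self.w_im(x_im) + self.b_re
        out_im = self.w_re(x_im) + self.w_im(x_re) + self.b_im
        return out_re, out_im

layer = ComplexLinear(4, 3)
x_re, x_im = torch.randn(2, 4), torch.randn(2, 4)
y_re, y_im = layer(x_re, x_im)  # real and imaginary parts of W x + b
```

Since everything stays real, autograd and the existing optimizers work as usual.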

Best regards

Thomas

So I am minimising the difference in probability of making a certain measurement (in the learnt and optimal case). This probability is determined by both the imaginary and real part of the amplitude. And the values that I am optimising over influence the final state of the system in a way determined by complex functions. So there is no way (to my knowledge) to rewrite the system such that I can get rid of these complex values without losing a lot of information.
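To make that concrete, the objective itself ends up real-valued even though the amplitudes it depends on are complex; something along these lines (purely illustrative, with amplitudes kept as real/imaginary pairs):

```python
import torch

def probability(amp_re, amp_im):
    # Measurement probability from a complex amplitude: |a|^2 = re^2 + im^2,
    # so both the real and the imaginary part contribute.
    return amp_re ** 2 + amp_im ** 2

def cost(learnt_re, learnt_im, optimal_re, optimal_im):
    # Real-valued cost: squared difference between the learnt and the
    # optimal measurement probabilities.
    return (probability(learnt_re, learnt_im)
            - probability(optimal_re, optimal_im)) ** 2
```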

I will have a look at the preview nightly and perhaps define my own linear.

Thanks for your help,
Merel