Backpropagation for complex tensors in PyTorch


I’m working on a project investigating complex-valued neural networks. I see that PyTorch has implementations for some complex NN layers, and I’d like to know how PyTorch handles backpropagation for complex data (e.g. how it computes derivatives and updates parameters). Pointers to the relevant parts of the codebase would also be helpful. Thank you!

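PyTorch's autograd supports complex tensors using Wirtinger calculus: for a real-valued loss L and a complex leaf z = x + iy, `z.grad` holds ∂L/∂x + i·∂L/∂y (equivalently 2·∂L/∂z̄), which is exactly the direction needed for gradient descent on complex parameters. For codebase pointers, the note "Autograd for Complex Numbers" in the official docs explains the convention, the per-operation derivative formulas are declared in `tools/autograd/derivatives.yaml`, and the autograd engine lives under `torch/csrc/autograd`. Below is a minimal sketch (the tensor values are just illustrative):

```python
import torch

# A complex leaf tensor we want gradients for.
z = torch.tensor([1.0 + 2.0j, 3.0 - 1.0j], requires_grad=True)

# A real-valued loss: L = sum |z|^2 = sum (x^2 + y^2).
# z.conj() * z is complex with zero imaginary part, so take .real.
loss = (z.conj() * z).sum().real

loss.backward()

# PyTorch's convention: z.grad = dL/dx + i*dL/dy = 2x + 2iy = 2z.
print(z.grad)  # tensor([2.+4.j, 6.-2.j])
```

Because `z.grad` already points in the steepest-ascent direction with respect to the underlying real and imaginary parts, the standard optimizers (`torch.optim.SGD`, `torch.optim.Adam`, etc.) can update complex parameters with `p -= lr * p.grad` just as they do for real ones.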
Hope this helps