Hi, it seems that my very elementary complex exponential does not support Autograd:

z = torch.tensor([1.1-2j], requires_grad=True)
f = torch.exp(z)

gives me a RuntimeError:

---------------------------------------------------------------------------
RuntimeError Traceback (most recent call last)
<ipython-input-31-69abc9207ce8> in <module>
1 z = torch.tensor([1.1-2j], requires_grad=True)
----> 2 f = torch.exp(z)
RuntimeError: exp does not support automatic differentiation for outputs with complex dtype.

How can I use Autograd with complex functions in general, and with the elementary complex exponential in particular?

In preparation for the 1.7 release and to avoid issues, we added error messages for all the functions that were not yet audited for complex autograd.
We are working on auditing the formulas and re-enabling them.
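For reference, here is a minimal sketch of how the snippet from the question works once complex autograd is enabled for exp (this assumes a PyTorch build where that audit has landed, e.g. a recent nightly or a later stable release). Since backward() needs a real scalar, the complex output is reduced through a real-valued function first; z.grad then follows PyTorch's conjugate Wirtinger convention:

```python
import torch

# Assumes a PyTorch build where complex autograd for exp is enabled.
z = torch.tensor([1.1 - 2j], requires_grad=True)
f = torch.exp(z)

# backward() requires a real scalar, so reduce the complex output
# through a real-valued function such as abs().
loss = f.abs().sum()
loss.backward()

# z.grad now holds the gradient under the conjugate Wirtinger convention.
print(z.grad)
```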

cc @anjali411 do we have an issue describing the process if people want to help here?

Hey albanD, I am using the stable PyTorch 1.7.1 build, and when I test functions that are listed under GRADIENT_IMPLEMENTED_FOR_COMPLEX, the 'does not support automatic differentiation' error still occurs for many of them. Should I instead use the nightly build to access these complex gradients? Thanks
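While waiting for the audited formulas to reach a stable release, one possible workaround on 1.7.x is to avoid complex dtypes entirely and carry the real and imaginary parts as separate real tensors, using the identity exp(a + ib) = e^a (cos b + i sin b); real-tensor autograd has always been supported. This is only a sketch of that approach, not an official recommendation:

```python
import torch

# Workaround sketch for builds where complex autograd for exp is gated:
# represent z = a + ib by two real leaf tensors and apply
# exp(a + ib) = e^a * (cos b + i * sin b) by hand.
a = torch.tensor([1.1], requires_grad=True)
b = torch.tensor([-2.0], requires_grad=True)

re = torch.exp(a) * torch.cos(b)  # Re(exp(a + ib))
im = torch.exp(a) * torch.sin(b)  # Im(exp(a + ib))

# Backprop through any real-valued loss built from the two parts.
# Here: |exp(a + ib)|^2 = e^(2a), so d(loss)/da = 2*e^(2a), d(loss)/db = 0.
loss = (re ** 2 + im ** 2).sum()
loss.backward()
print(a.grad, b.grad)
```

The drawback is that every complex operation downstream must also be written in terms of the two real tensors, so this only pays off for short formulas.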