The current nightly build, version 1.8.0.dev20201208, gives the
wrong autograd result for the square of a complex tensor. This
very simple case is handled correctly in version 1.6.0.
A quick look at GitHub didn’t turn up any issues that seemed
directly relevant to this change.
For a complex variable z, the derivative of its square is given by
d(z*z)/dz = 2*z (just as it would be if z were real). It appears
that the current nightly build incorrectly returns 2*z.conj() as
the derivative.
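For completeness, since z*z is holomorphic this follows directly from the usual limit definition of the complex derivative (a short derivation, nothing PyTorch-specific):

\[
  \frac{d}{dz}\, z^2
  = \lim_{h \to 0} \frac{(z+h)^2 - z^2}{h}
  = \lim_{h \to 0} (2z + h)
  = 2z .
\]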
Here are the new and old PyTorch results:
>>> import torch
>>> torch.__version__
'1.8.0.dev20201208'
>>> z = torch.tensor([2. + 1.j], requires_grad=True)
>>> zsq = z * z
>>> zsq.backward()
>>> z.grad
tensor([4.-2.j])
(The result for z.grad should be 2 * z = tensor([4.+2.j]).)
>>> import torch
>>> torch.__version__
'1.6.0'
>>> z = torch.tensor([2. + 1.j], requires_grad=True)
>>> zsq = z * z
>>> zsq.backward()
/home/user/miniconda3/lib/python3.8/site-packages/torch/autograd/__init__.py:125: UserWarning: Complex backward is not fully supported yet and could lead to wrong gradients for functions we have not fixed yet (Triggered internally at /opt/conda/conda-bld/pytorch_1595629395347/work/torch/csrc/autograd/python_engine.cpp:157.)
  Variable._execution_engine.run_backward(
>>> z.grad
tensor([4.+2.j])
(Here z.grad is correct.)
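To make the comparison explicit, here is a minimal self-contained sketch of the check I am doing by hand above; it just prints autograd’s gradient next to the two candidate formulas (the values in the comments come from the transcripts, so what you see depends on which build you run it under):

import torch

# Same one-element complex tensor as in the transcripts above.
z = torch.tensor([2. + 1.j], requires_grad=True)

zsq = z * z
zsq.backward()  # zsq has a single element, so no explicit gradient argument is needed

# Print autograd's result next to the two candidate formulas.
print("z.grad     =", z.grad)                 # nightly: tensor([4.-2.j]); 1.6.0: tensor([4.+2.j])
print("2*z        =", 2 * z.detach())         # tensor([4.+2.j])
print("2*z.conj() =", 2 * z.detach().conj())  # tensor([4.-2.j])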
The same thing happens using z.pow(2.0) in place of z * z; a minimal
sketch of that variant follows.
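import torch

z = torch.tensor([2. + 1.j], requires_grad=True)

zsq = z.pow(2.0)  # same function as z * z, but routed through pow's backward
zsq.backward()

print(z.grad)  # nightly: tensor([4.-2.j]) (= 2*z.conj()); 1.6.0: tensor([4.+2.j]) (= 2*z)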
In the following post, I link to a potentially relevant discussion of
generalized complex differentiation.