There are useful functions of a complex variable z that are not
"complex-differentiable," that is, they are not analytic functions.
Two examples are z.conj() and |z|**2 = z * z.conj().
Note that these functions are differentiable when viewed as real
functions of the two real variables x and y, where z = x + yj.
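As a concrete illustration, here is a minimal sketch (assuming the HIPS autograd package) that sidesteps the question by treating |z|**2 as a real function of the pair (x, y):

```python
import autograd.numpy as np
from autograd import grad

# |z|**2 viewed as a real-valued function of the real pair (x, y),
# where z = x + yj; abs(z)**2 == x**2 + y**2, which is plainly
# differentiable even though z -> |z|**2 is not analytic.
def f(xy):
    x, y = xy[0], xy[1]
    return x**2 + y**2

grad_f = grad(f)
print(grad_f(np.array([3.0, 4.0])))  # -> [6. 8.], i.e. (2x, 2y)
```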
I think that when a function is analytic, autograd should return
its conventional complex derivative. However, we would like
autograd also to do something “useful” for non-analytic functions
that are differentiable when understood as real functions.
For example, it would be nice if gradient descent worked for
|z - z0|**2 = (z - z0) * (z - z0).conj() as a function of
z (which takes its minimum value at
z = z0).
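For a real-valued loss, the steepest-descent direction works out to the conjugate Wirtinger derivative; equivalently, one can just parameterize by the real pair (x, y), which is what this sketch does (assuming HIPS autograd, with a made-up target z0 = 1 + 2j and a hand-rolled descent loop):

```python
import autograd.numpy as np
from autograd import grad

z0 = 1.0 + 2.0j  # hypothetical minimizer, chosen for illustration

# |z - z0|**2 parameterized by the real pair (x, y) with z = x + yj.
def loss(xy):
    dx = xy[0] - z0.real
    dy = xy[1] - z0.imag
    return dx**2 + dy**2

g = grad(loss)
xy = np.array([0.0, 0.0])
for _ in range(200):
    xy = xy - 0.1 * g(xy)
print(xy)  # -> approximately [1. 2.], i.e. z = z0
```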
I don’t have a well-thought-out proposal for how to do this. But I
came across this exposition that discusses some of the core issues:
One facet of this issue is illustrated in this post, where naive gradient
descent is shown to fail for
|z| (for complex
z) using autograd.
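The failure mode can be reproduced by hand, without autograd: the Wirtinger derivative of |z| is d/dz |z| = z.conj() / (2 * |z|), and stepping along its negative (rather than the negative of its conjugate) moves uphill for non-real z. This is a sketch of my reading of that post, not necessarily its exact code:

```python
import numpy as np

# Wirtinger derivative of f(z) = |z|: df/dz = conj(z) / (2*|z|).
def dfdz(z):
    return np.conj(z) / (2 * abs(z))

z = 1.0j  # start on the imaginary axis
for _ in range(3):
    z = z - 0.2 * dfdz(z)           # naive step: |z| grows
    print(z, abs(z))

z = 1.0j
for _ in range(3):
    z = z - 0.2 * np.conj(dfdz(z))  # conjugated step: |z| shrinks
    print(z, abs(z))
```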