Hello Forum!

There are useful functions of a complex variable, `z`, that are not “complex-differentiable,” that is, they are not *analytic* functions. Two examples would be `z.conj()` and `|z|**2 = z * z.conj()`.
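
To see the non-analyticity concretely, here is a minimal sketch in plain Python (no autograd involved; `diff_quotient` is just an ad-hoc helper) showing that the difference quotient of `z.conj()` depends on the direction from which `h` approaches zero, so no single complex derivative exists:

```python
def diff_quotient(f, z, h):
    # (f(z + h) - f(z)) / h: for an analytic f this approaches a
    # single limit as h -> 0, independent of the direction of h.
    return (f(z + h) - f(z)) / h

def f(z):
    return z.conjugate()  # built-in complex conjugation

z = 1.0 + 2.0j
# For f = conj, the quotient equals h.conjugate() / h for every h,
# so the directional dependence shows up at any step size:
print(diff_quotient(f, z, 0.5 + 0.0j))  # h real:      quotient == +1
print(diff_quotient(f, z, 0.5j))        # h imaginary: quotient == -1
```

The two directional limits disagree (`+1` versus `-1`), which is exactly the failure of complex differentiability.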

Note that these functions are differentiable when viewed as real functions of two real variables, `x` and `y`, where `z = x + yj`.
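
As a concrete check, write `z.conj() = x - yj` in components `u(x, y) = x` and `v(x, y) = -y`. Both components are smooth as real functions, but the Cauchy-Riemann condition `u_x = v_y` fails (here `u_x = 1` while `v_y = -1`), so `z.conj()` is real-differentiable everywhere yet complex-differentiable nowhere.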

I think that when a function is analytic, autograd should return its conventional complex derivative. However, we would like autograd also to do something “useful” for non-analytic functions that are differentiable when understood as real functions.

For example, it would be nice if gradient descent worked for minimizing `|z - z0|**2 = (z - z0) * (z - z0).conj()` with respect to `z` (which takes on its minimum value of `0` when `z = z0`).
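
Here is a minimal sketch of how that could look, with the Wirtinger-style gradient written out by hand. Nothing here is an existing autograd API; `z0`, the learning rate, and the starting point are arbitrary illustrative choices:

```python
z0 = 2.0 + 1.0j   # target
z = -1.0 + 3.0j   # arbitrary starting point
lr = 0.25         # learning rate

for step in range(50):
    # For f(z) = |z - z0|**2, the conjugate Wirtinger derivative
    # (df/dz)* = df/d(z.conj()) = z - z0, written out by hand here.
    grad = z - z0
    z = z - lr * grad

print(z, abs(z - z0))  # z ~ z0 and |z - z0| ~ 0
```

The step `z -= lr * grad` is ordinary gradient descent on the two real variables `x` and `y` packaged as a single complex update, which is why it converges to `z0`.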

I don’t have a well-thought-out proposal for how to do this. But I came across this exposition that discusses some of the core issues: *The Complex Gradient Operator and the CR-Calculus*.
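
For reference, and as my own summary rather than a quote from that paper: with `z = x + yj`, the CR-calculus works with the pair of Wirtinger derivatives

$$
\frac{\partial}{\partial z} = \frac{1}{2}\left(\frac{\partial}{\partial x} - j\,\frac{\partial}{\partial y}\right),
\qquad
\frac{\partial}{\partial \bar z} = \frac{1}{2}\left(\frac{\partial}{\partial x} + j\,\frac{\partial}{\partial y}\right).
$$

For an analytic function, $\partial f / \partial \bar z = 0$ and $\partial f / \partial z$ is the conventional complex derivative, which matches the behavior I suggested above. For a real-valued $f$, $f_x + j\,f_y = 2\,\partial f / \partial \bar z$, so stepping along $-\partial f / \partial \bar z$ is ordinary gradient descent in `x` and `y`.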

One facet of this issue is illustrated in this post, where naive gradient descent is shown to fail for `|z|` (for complex `z`) using autograd in version 1.6.0:
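
I can't reproduce that post here, but a small hand-computed sketch shows one way the conjugation convention matters for `|z|`: descending along the conjugate-Wirtinger direction `z / (2*abs(z))` shrinks `|z|`, while descending along the unconjugated `z.conjugate() / (2*abs(z))` does not. Whether this is the exact failure mode in autograd 1.6.0 I don't know, so treat the code as purely illustrative:

```python
def descend(z, grad_fn, lr=0.1, steps=15):
    # plain gradient descent: z <- z - lr * grad_fn(z)
    for _ in range(steps):
        z = z - lr * grad_fn(z)
    return z

z_start = 1.0j  # start at |z| = 1, away from the kink at z = 0

# Descent for f(z) = |z| along the conjugate-Wirtinger direction
# d|z|/d(z.conj()) = z / (2|z|):
good = descend(z_start, lambda z: z / (2 * abs(z)))

# "Naive" descent along the unconjugated d|z|/dz = z.conj() / (2|z|):
bad = descend(z_start, lambda z: z.conjugate() / (2 * abs(z)))

print(abs(good))  # ~0.25: |z| shrinks steadily toward its minimum
print(abs(bad))   # ~1.75: |z| grows every step; descent fails
```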

Best.

K. Frank