Hello Schalky and Alban!

The short answer is that you can implement your own correct complex norm (see below).

Yes, I think this is the core issue.

On 1.6.0 (current stable), I can take the norm of a complex vector, but the result isn't correct. (I haven't tried 1.7.) PyTorch is using the complex square, rather than `x * x.conj()`.

This illustrates the issue, and shows how you can implement your own correct version:

```
>>> import torch
>>> torch.__version__
'1.6.0'
>>> cvec = torch.randn (3, dtype = torch.cfloat)
>>> rvec = (cvec + cvec.conj()) / 2.0
>>> cvec
tensor([1.2428-0.6251j, 0.1556+0.8382j, 0.3308+0.8406j])
>>> rvec
tensor([1.2428+0.j, 0.1556+0.j, 0.3308+0.j])
>>> torch.norm (cvec) # incorrect, result should be real
tensor(0.5590-0.6590j)
>>> (cvec * cvec).sum()**0.5 # incorrect, no complex conjugate
tensor(0.5590-0.6590j)
>>> (cvec * cvec.conj()).sum()**0.5 # correct norm with complex conjugate
tensor(1.8650-5.5731e-09j)
>>> torch.norm (rvec)
tensor(1.2955+0.j)
>>> (rvec * rvec).sum()**0.5
tensor(1.2955+0.j)
>>> (rvec * rvec.conj()).sum()**0.5
tensor(1.2955+0.j)
```
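If you want to package the workaround from the transcript above as a reusable function, here is one way to do it (a sketch; the name `complex_norm` is my own, not part of PyTorch's API). It uses the fact that `x.abs()**2` equals `(x * x.conj()).real` elementwise, so the result comes out as a real tensor rather than a complex one with a tiny imaginary part:

```python
import torch

def complex_norm(x):
    # Correct 2-norm for a complex (or real) tensor:
    # sum the squared magnitudes |x_i|^2, then take the square root.
    # x.abs()**2 == (x * x.conj()).real elementwise, so the result is real.
    return (x.abs() ** 2).sum().sqrt()

cvec = torch.randn(3, dtype=torch.cfloat)
print(complex_norm(cvec))  # real-valued scalar tensor
```

For a real-valued vector this agrees with `torch.norm`, and for a complex vector it matches the `(cvec * cvec.conj()).sum()**0.5` computation shown above, minus the residual imaginary part.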

Best.

K. Frank