The issue here is not with the FFT but with the `torch.prod` operation, which does not support autograd for complex inputs at the moment.
Complex autograd is a work in progress, but you can open a feature request on GitHub if you need it now.
Thank you for your reply. I should have been clearer in formulating the problem.
I’m well aware that the problem is with `prod`; I only included the FFT part to show the context – perhaps someone clever might see a way around the problem, knowing the context. I have a hunch that the complex dtype could be avoided entirely. Perhaps.
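Along the lines of avoiding the complex dtype: one possible workaround (a sketch, not something from this thread – the helper name `prod_complex_as_real` is made up) is to carry the real and imaginary parts as two separate real tensors and expand the complex product by hand. Since every operation is then a real-valued multiply/add, autograd works without any complex support:

```python
import torch

def prod_complex_as_real(re, im):
    """Product of complex numbers z_k = re[k] + i*im[k], passed as
    two real 1-D tensors. Only real ops are used, so autograd works
    even where complex autograd is unsupported."""
    pr, pi = re[0], im[0]
    for k in range(1, re.numel()):
        # (pr + i*pi) * (re[k] + i*im[k])
        pr, pi = pr * re[k] - pi * im[k], pr * im[k] + pi * re[k]
    return pr, pi

# Usage sketch: gradients flow through the real/imag parts.
re = torch.tensor([1.0, 2.0], requires_grad=True)
im = torch.tensor([0.5, -1.0], requires_grad=True)
pr, pi = prod_complex_as_real(re, im)   # (1+0.5i)(2-1i) = 2.5 + 0i
(pr**2 + pi**2).backward()              # backward works fine
```

The Python loop is slow for long tensors, but for a short product it sidesteps the missing complex autograd entirely; the FFT output would just need to be split via `.real`/`.imag` (or `torch.view_as_real`) first.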