Is there a Hankel function in PyTorch?

I want to use the Hankel function with torch in a way that can propagate gradients. There is only the Bessel function of the first kind, and no Bessel function of the second kind or Hankel function. I don't know if that is right. Thank you very much.

Hi Dong!

It appears that pytorch does implement bessel functions of the second
kind for orders 0 and 1 as torch.special.bessel_y0() and
torch.special.bessel_y1(), but only for real arguments, and
differentiation (autograd) is not supported. (The torch.special documentation
seems not to mention y0() and y1().) I do not believe that pytorch currently
implements hankel functions themselves.

Here’s an illustration for the current stable version, pytorch 2.0.1:

>>> import torch
>>> torch.__version__
'2.0.1'
>>> tc = torch.tensor ([1.23], requires_grad = True)
>>> tg = torch.tensor ([1.23], requires_grad = True, device = 'cuda')
>>> torch.special.bessel_j0 (tc)
tensor([0.6561])
>>> torch.special.bessel_j1 (tc)
tensor([0.5058])
>>> torch.special.bessel_y0 (tc)
tensor([0.2464])
>>> torch.special.bessel_y1 (tc)
tensor([-0.5990])
>>> torch.special.bessel_j0 (tg)
tensor([0.6561], device='cuda:0')
>>> torch.special.bessel_j1 (tg)
tensor([0.5058], device='cuda:0')
>>> torch.special.bessel_y0 (tg)
tensor([0.2464], device='cuda:0')
>>> torch.special.bessel_y1 (tg)
tensor([-0.5990], device='cuda:0')

Note the absence of a grad_fn for the results of the bessel functions.

For some context, see this github issue:


K. Frank

Hi K. Frank!
Thank you for your help. Yeah, the bessel functions of the second
kind do not support differentiation. Maybe wrapping the relevant parts of scipy.special, the way tensorflow does, would make them differentiable. :thinking:


Hi Dong!

You can wrap scipy.special’s hankel function in a custom autograd function.
Put, for example, hankel1() in your custom function’s forward() method
and implement the rather straightforward expression for the derivative of the
hankel function (see, for example, MathWorld), again using scipy, in the custom
function’s backward() method.
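
A minimal sketch of what such a wrapper could look like (the class and function names here are my own, not an official API, and it assumes real arguments x > 0). The backward() uses the standard recurrence for the derivative, d/dx H1_nu (x) = H1_(nu-1) (x) - (nu / x) H1_nu (x):

```python
import torch
from scipy import special

class Hankel1 (torch.autograd.Function):
    """Hankel function of the first kind, H1_nu (x) = J_nu (x) + i Y_nu (x),
    for real x, with a hand-written backward pass that calls scipy."""

    @staticmethod
    def forward (ctx, nu, x):
        ctx.nu = nu
        ctx.save_for_backward (x)
        # scipy works on numpy arrays, so detach and round-trip through numpy
        h = special.hankel1 (nu, x.detach().cpu().numpy())
        out = torch.from_numpy (h).to (x.device)
        return out.to (torch.complex64) if x.dtype == torch.float32 else out

    @staticmethod
    def backward (ctx, grad_output):
        (x,) = ctx.saved_tensors
        xn = x.detach().cpu().numpy()
        # d/dx H1_nu (x) = H1_(nu-1) (x) - (nu / x) * H1_nu (x)
        dh = special.hankel1 (ctx.nu - 1, xn) - (ctx.nu / xn) * special.hankel1 (ctx.nu, xn)
        dh = torch.from_numpy (dh).to (x.device)
        # x is real, so dL/dx = Re (conj (dL/dH) * dH/dx); no gradient for nu
        return None, (grad_output.conj() * dh).real.to (x.dtype)

def hankel1 (nu, x):
    """Differentiable wrapper around scipy.special.hankel1 (real x only)."""
    return Hankel1.apply (nu, x)
```

With x = torch.tensor ([1.23], requires_grad = True), hankel1 (0, x) should reproduce the bessel_j0 and bessel_y0 values from the transcript above as its real and imaginary parts, and the result now carries a grad_fn, so you can backpropagate through it. Note that the round trip through numpy means the computation runs on the cpu even for gpu tensors.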

Good luck!

K. Frank