Modified bessel function arbitrary order

I am looking for a PyTorch analog to scipy.special.iv which accepts an arbitrary order as one of its arguments. I only see fixed 0th- and 1st-order implementations in torch.special's docs. Does anyone know if this analog exists elsewhere?

Hi Wesley!

I am not aware of any PyTorch implementation of higher-order Bessel functions.

If you need gradients (backpropagation), you could wrap scipy’s iv() in
a custom autograd Function. You would (presumably using scipy again)
implement the derivative of iv() in your Function’s backward() method.

This would come at the cost of not using the gpu for your Bessel-function
computations. (For convenience, you could have your custom Function
move relevant tensors from the gpu to the cpu as necessary so that it
would accept gpu tensors, but the computations would still be done on
the cpu.)
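As a rough sketch of what such a wrapper could look like (the class name ModifiedBesselIv is made up for illustration): scipy.special.ivp() gives the derivative of iv(), which follows from the identity d/dx I_v(x) = (I_{v-1}(x) + I_{v+1}(x)) / 2, so backward() can lean on scipy as well:

```python
import numpy as np
import torch
from scipy import special


class ModifiedBesselIv(torch.autograd.Function):
    """Sketch: wrap scipy.special.iv so it works inside autograd.

    The order v is treated as a plain (non-differentiable) float;
    only gradients with respect to x are returned. Tensors are moved
    to the cpu for scipy and back to x's device afterward.
    """

    @staticmethod
    def forward(ctx, v, x):
        ctx.v = v
        ctx.save_for_backward(x)
        result = special.iv(v, x.detach().cpu().numpy())
        return torch.as_tensor(result, dtype=x.dtype, device=x.device)

    @staticmethod
    def backward(ctx, grad_output):
        (x,) = ctx.saved_tensors
        # d/dx I_v(x) = (I_{v-1}(x) + I_{v+1}(x)) / 2, which scipy
        # computes directly as special.ivp(v, x).
        deriv = special.ivp(ctx.v, x.detach().cpu().numpy())
        deriv = torch.as_tensor(deriv, dtype=x.dtype, device=x.device)
        # No gradient for the order argument v.
        return None, grad_output * deriv
```

Usage would look like `y = ModifiedBesselIv.apply(2.5, x)` for a tensor x with requires_grad=True.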


K. Frank


Thank you. I think I need to stick with an implementation that will allow GPU usage. I'm trying to implement a power series expansion in PyTorch instead. I'm not sure how well batching is going to work out though. If that doesn't work, I'll have to ask our resident C++ expert to help me dive under the hood.
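For what it's worth, a truncated version of the standard series I_v(x) = sum_k (x/2)^(2k+v) / (k! Γ(k+v+1)) batches naturally in PyTorch, since the sum over k can be broadcast against a trailing dimension. A minimal sketch (the helper name iv_series is hypothetical; it assumes x > 0, and the fixed term count is only adequate for moderate x):

```python
import torch


def iv_series(v, x, num_terms=30):
    """Hypothetical helper: truncated power series for I_v(x).

    I_v(x) = sum_{k>=0} (x/2)^(2k+v) / (k! * Gamma(k+v+1)),
    evaluated in log space via lgamma for numerical stability.
    Broadcasts over any batch shape of x and runs on whatever
    device x lives on (cpu or gpu). Assumes x > 0.
    """
    k = torch.arange(num_terms, dtype=x.dtype, device=x.device)
    # Broadcast x against the series index k -> shape (..., num_terms).
    xk = x.unsqueeze(-1)
    log_terms = ((2 * k + v) * torch.log(xk / 2)
                 - torch.lgamma(k + 1)
                 - torch.lgamma(k + v + 1))
    return torch.exp(log_terms).sum(dim=-1)
```

Because everything is expressed with differentiable torch ops, autograd works out of the box, and no cpu round-trip is needed.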