Autograd gamma function

I am running a script that uses the reparameterisation trick on the Dirichlet distribution, sampling from a gamma distribution, and I am using the torch.igammac(x, 0) function. I am getting this error message:

Exception has occurred: RuntimeError       (note: full exception trace is shown but execution is paused at: _run_module_as_main)
the derivative for 'igammac: input' is not implemented.

Getting the derivative of the gamma function shouldn't be that difficult, so why is it not implemented? Is it easy to just add it myself?
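A minimal sketch of what I am doing (tensor values just for illustration); the derivative of igammac with respect to its first argument is the missing one, so requiring grad on that argument triggers the RuntimeError:

```python
import torch

# igammac's derivative w.r.t. its first argument ("input") is not
# implemented, so autograd raises a RuntimeError (at the forward call
# or at .backward(), depending on the PyTorch version).
a = torch.tensor([2.0], requires_grad=True)
x = torch.tensor([1.0])

raised = False
try:
    torch.igammac(a, x).sum().backward()
except RuntimeError:
    raised = True
```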


It is indeed not implemented right now.
If you have the formula for it, you can send a PR that adds it to the file that defines derivatives here: pytorch/derivatives.yaml at de5e3b5eb045829041144e1c4e44448d13313d74 · pytorch/pytorch · GitHub, and we will be happy to merge it! (You can add me as a reviewer.)
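For reference, each entry in that file pairs a function schema with a gradient expression per differentiable input; a simple existing entry looks roughly like this (shape reproduced from memory, so double-check against the file):

```yaml
- name: sin(Tensor self) -> Tensor
  self: grad * self.cos()
```

Inputs whose formula hasn't been added yet are marked with `not_implemented(...)`, which is where the error message you saw comes from.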

Derivative issue aside, it is odd that you need the incomplete gamma function for this: it should be enough to take ratios of GammaDistribution(a, 1) samples. torch._standard_gamma(a) sampling is differentiable (via reparameterisation, I assume).
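Concretely, a sketch of that idea (concentration values assumed for illustration):

```python
import torch

# A Dirichlet(alpha) sample can be built from independent
# Gamma(alpha_i, 1) samples normalised by their sum; no incomplete
# gamma function is needed.
alpha = torch.tensor([0.5, 1.0, 2.0], requires_grad=True)
g = torch._standard_gamma(alpha)   # reparameterised gamma samples
x = g / g.sum()                    # lands on the probability simplex

# Any scalar loss of x backpropagates to alpha.
loss = (x * torch.arange(1.0, 4.0)).sum()
loss.backward()
```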

That actually seems to work. I was using a uniform proposal and the inverse-CDF method to obtain my sample. I didn't know that torch.distribution… had grad built in.
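For anyone landing here later, this is the shape of what ended up working for me (concentration values just for illustration):

```python
import torch

# Dirichlet.rsample() is reparameterised, so gradients flow back to
# the concentration parameter directly.
alpha = torch.tensor([0.5, 1.0, 2.0], requires_grad=True)
d = torch.distributions.Dirichlet(alpha)
x = d.rsample()
(x * x).sum().backward()
```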