How to know if gradients backpropagate through a PyTorch function?

For the following PyTorch functions (a quick grad_fn check on them is sketched after the list):

  • torch.special.erf() which is used in torch.distributions.normal.Normal().cdf()
  • torch.special.erfinv() which is used in torch.distributions.normal.Normal().icdf()
  • torch._standard_gamma() which is used in torch.distributions.gamma.Gamma().rsample()
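
For context, here's the kind of quick check I have in mind, a minimal sketch showing that autograd at least records these ops (the exact grad_fn names may vary across PyTorch versions):

```python
import torch
from torch.distributions import Normal

# A value with a tracked gradient, inside erfinv's domain (-1, 1)
# and inside icdf's domain (0, 1)
x = torch.tensor(0.3, requires_grad=True)
n = Normal(0.0, 1.0)

# Each print shows a non-None grad_fn, i.e. autograd recorded the op
print(torch.special.erf(x).grad_fn)
print(n.cdf(x).grad_fn)
print(torch.special.erfinv(x).grad_fn)
print(n.icdf(x).grad_fn)
```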

I’m wondering if gradients can actually backpropagate through these. It appears that they each have their respective grad_fns; however, for something like torch._standard_gamma(parameter), I thought this was a completely stochastic node (statistically, I’m not sure how you could reparametrize gamma sampling), and so I assumed gradients can’t backpropagate through parameter.
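
To make the gamma case concrete, here's a minimal sanity check I can run (scalar concentration and rate chosen purely for illustration): if .grad on the parameters is populated after backward(), then gradients did reach them through rsample().

```python
import torch
from torch.distributions import Gamma

# Hypothetical scalar parameters, just for illustration
alpha = torch.tensor(2.0, requires_grad=True)  # concentration
beta = torch.tensor(1.0, requires_grad=True)   # rate

sample = Gamma(alpha, beta).rsample()
print(sample.grad_fn)  # non-None: the sampling op was recorded by autograd

sample.backward()
print(alpha.grad, beta.grad)  # non-None values mean gradients reached the parameters
```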

As a more general question, I’m curious how to tell whether gradients can backpropagate through a given PyTorch function.
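
One tool I've looked at for deterministic ops is torch.autograd.gradcheck, which compares the analytical gradient against finite differences (it wants double-precision inputs). Of course it can't be applied directly to a stochastic sampler like torch._standard_gamma(), since repeated calls return different values. A minimal sketch:

```python
import torch

# gradcheck wants double precision; erfinv is only defined on (-1, 1)
x = torch.randn(5, dtype=torch.double, requires_grad=True)
y = (torch.rand(5, dtype=torch.double) * 1.8 - 0.9).requires_grad_()

# Returns True when the analytical gradient matches finite differences
print(torch.autograd.gradcheck(torch.special.erf, (x,)))
print(torch.autograd.gradcheck(torch.special.erfinv, (y,)))
```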

I appreciate any help! I’m doing some data augmentation with VAEs for my thesis, so I want to make sure this works.