Reparameterization trick

I have a complicated computational graph that computes the likelihood of a dataset sampled from a parameterized probability distribution. I infer the parameter values by gradient ascent on this likelihood. Computing the gradient with respect to one of the parameters requires the reparameterization trick. Will PyTorch apply it automatically when computing the gradients?


If you use your_distrib.rsample(), yes. See an example here:
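A minimal sketch of the difference, using a `Normal` distribution for illustration: `rsample()` draws the sample as a differentiable function of the parameters (the reparameterization trick), so gradients flow back to them, whereas `sample()` returns a detached tensor.

```python
import torch
from torch.distributions import Normal

# Learnable parameters of the distribution
mu = torch.tensor(0.5, requires_grad=True)
log_sigma = torch.tensor(0.0, requires_grad=True)

dist = Normal(mu, log_sigma.exp())

# rsample() applies the reparameterization trick internally:
# the sample is mu + sigma * eps with eps ~ N(0, 1),
# so gradients propagate to mu and log_sigma.
x = dist.rsample()
loss = (x ** 2).mean()
loss.backward()
print(mu.grad is not None)          # True: gradients reached the parameter

# sample() returns a detached tensor; no gradients flow through it.
y = dist.sample()
print(y.requires_grad)              # False
```

You can check `dist.has_rsample` to see whether a given distribution supports the reparameterized path at all.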


Thanks for the reply.
Could someone point me to more examples of implementations of stochastic computation graphs in PyTorch?