Hello,
I am relatively new to PyTorch and distributions, so forgive any naïveté in the question.
I am attempting to use PyTorch distributions in a Bayesian inference framework. At this point I am successfully using Poisson and Gaussian distributions. However, one crucial part of my model/guide is a sort of mapping from a to a'. These two quantities can be arranged in a square 2D matrix, with a in rows and a' in columns, for example. Importantly, this matrix can be given as P(a'|a) in its forward form or P(a|a') in its backward form (the two are easily computed from each other using Bayes' rule).
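For concreteness, here is the forward/backward relationship I mean, with made-up numbers and an assumed prior over a (NumPy, just to illustrate the bookkeeping):

```python
import numpy as np

# Hypothetical 2x2 forward matrix P(a'|a): rows indexed by a, columns by a',
# each row summing to 1. Numbers are invented for illustration only.
forward = np.array([[0.7, 0.3],
                    [0.2, 0.8]])
prior_a = np.array([0.6, 0.4])  # assumed prior P(a)

# Joint P(a, a') = P(a'|a) * P(a)
joint = forward * prior_a[:, None]
# Marginal P(a') = sum over a of the joint
marginal_ap = joint.sum(axis=0)
# Backward matrix P(a|a') via Bayes' rule; now each *column* sums to 1
backward = joint / marginal_ap[None, :]
```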
I am having trouble implementing this a -> a' conversion in a way that lets the loss back-propagate through it. In actuality, both a and a' are continuous variables; however, discretizing the two quantities at an integer granularity is reasonable for the purposes of the model.
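What I am imagining is something like the sketch below: the forward matrix is parameterized by unconstrained logits with a row-wise softmax (so every row stays a valid distribution), and applied to a probability vector over the discretized a by a matrix product, which is differentiable end to end. The granularity and names here are made up, not my actual model:

```python
import torch

n = 4  # assumed granularity of the discretized a / a'

# Learnable logits defining P(a'|a); softmax over dim=1 makes rows sum to 1
logits = torch.randn(n, n, requires_grad=True)
forward = torch.softmax(logits, dim=1)   # forward matrix P(a'|a)

p_a = torch.full((n,), 1.0 / n)          # some distribution over discrete a
p_ap = p_a @ forward                     # implied distribution over a'

# Any scalar loss built from p_ap back-propagates into the logits
loss = -torch.log(p_ap[0])
loss.backward()
```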
I have come across the idea of TransformedDistribution, but I am uncertain of its utility here. I have also considered using some combination of Categorical distributions, with little success.
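On the Categorical side, I wondered whether the relaxed (Gumbel-Softmax) variant, torch.distributions.RelaxedOneHotCategorical, might help, since unlike Categorical.sample() its rsample() gives a "soft" one-hot vector that gradients can flow through. A rough sketch of what I mean (again, invented numbers, not my actual model):

```python
import torch
from torch.distributions import RelaxedOneHotCategorical

# Hypothetical probabilities over the discretized a'; gradients should
# flow back into `probs` through the relaxed sample.
probs = torch.tensor([0.1, 0.2, 0.3, 0.4], requires_grad=True)
dist = RelaxedOneHotCategorical(temperature=torch.tensor(0.5), probs=probs)

soft_sample = dist.rsample()   # soft one-hot vector over a', sums to 1

loss = soft_sample.pow(2).sum()
loss.backward()
```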
Does anyone have any thoughts on how this could be achieved?
Thanks for any hints/tips/tricks!