To make a transformed distribution, I want to implement an inverse softplus function. To do this I would need to implement something like this:

```python
if x > 20:
    return x  # softplus(x) ≈ x here, so the inverse is ≈ the identity
return (x.exp() - 1).log()
```
torch.nn.functional.threshold looked promising, except it doesn't support tensors in its `threshold` argument. torch.where looks good too, but I would prefer a solution that is compatible with PyTorch 0.3.
The `cond * (...) + (1 - cond) * (...)` hack used in the thread below doesn't work here, since the `(...)` terms may be infinite.
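A minimal sketch of why that hack fails: in floating point both branches are evaluated, and once the masked-out branch overflows to `inf`, the product `0 * inf` is `nan`.

```python
import torch

y = torch.tensor([0.5, 100.0])
cond = (y > 20).float()

# Branchless select: cond picks y, (1 - cond) picks the expm1/log branch.
# y.expm1() overflows to inf at y = 100 in float32, and 0 * inf = nan,
# so the masked-out branch still poisons the result.
naive = cond * y + (1 - cond) * y.expm1().log()
print(naive)  # second entry is nan even though cond masks that branch out
```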
You could clamp the term before it leaves the safe range and then multiply, even if it isn't the most efficient solution.
Thanks! That's a good idea.

For the record, I ended up installing PyTorch 0.4, which was actually a delight to compile. Here's the code:
```python
import torch
import torch.nn as nn
from torch.distributions import constraints
from torch.distributions.transforms import Transform


class SoftplusTransform(Transform):
    r"""Transform via the mapping :math:`y = \log(1 + \exp(x))`."""
    domain = constraints.real
    codomain = constraints.positive
    bijective = True
    sign = +1

    def __init__(self):
        super(SoftplusTransform, self).__init__()
        self.softplus = nn.Softplus()
        self.threshold = 20
        self.log_sigmoid = nn.LogSigmoid()

    def _call(self, x):
        return self.softplus(x)

    def _inverse(self, y):
        # Above the threshold, softplus(x) == x to float precision,
        # and expm1 would overflow, so return y unchanged there.
        return torch.where(y > self.threshold, y, y.expm1().log())

    def log_abs_det_jacobian(self, x, y):
        # d/dx log(1 + exp(x)) = sigmoid(x)
        return self.log_sigmoid(x)
```
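As a quick self-contained sanity check of the `torch.where`-based inverse (my own sketch, using the same threshold of 20 as the class above): round-tripping through `nn.Softplus` recovers the input even where `exp` would overflow in float32.

```python
import torch
import torch.nn as nn

softplus = nn.Softplus()
threshold = 20

def inv_softplus(y):
    # Above the threshold softplus is numerically the identity,
    # so the inverse can simply return y there.
    return torch.where(y > threshold, y, y.expm1().log())

x = torch.linspace(-10, 100, steps=56)
assert torch.allclose(inv_softplus(softplus(x)), x, atol=1e-4)
```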