[DUPLICATED] SigmoidTransform(Normal()).log_prob() sometimes outputs values larger than zero

Hello, I found that distribution.log_prob() sometimes outputs values larger than 0 while trying to use TransformedDistribution.

>>> torch.__version__
'0.4.0a0+3749c58'
>>> from torch.distributions import Normal, SigmoidTransform, TransformedDistribution
>>> m=TransformedDistribution(Normal(torch.tensor([0.]), torch.tensor([1.])), [SigmoidTransform()])
>>> m.sample()
tensor([ 0.5783])
>>> m.sample()
tensor([ 0.8666])
>>> m.log_prob(m.sample())
tensor([ 0.2810])
>>> m.log_prob(m.sample())
tensor([-0.6253])
>>> m.log_prob(m.sample())
tensor([ 0.3911])
>>> m.log_prob(m.sample())
tensor([-1.2592])
>>> m.log_prob(m.sample()).exp()
tensor([ 1.4093])
>>> m.log_prob(m.sample()).exp()
tensor([ 0.5151])
>>> m.log_prob(m.sample()).exp()
tensor([ 1.4236])
>>> m.log_prob(m.sample()).exp()
tensor([ 1.4300])
>>> m.log_prob(m.sample()).exp()
tensor([ 1.5884])

Is it normal behavior for log_prob(val).exp() to be larger than 1?
If it is, what does log_prob mean?
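(For context, my current understanding, which may be what's going on here: log_prob returns the log of a probability *density*, not a probability, and a density can exceed 1 as long as it integrates to 1 over the support. A minimal sketch with built-in distributions:)

```python
import torch
from torch.distributions import Uniform, Normal

# A pdf can exceed 1 when the distribution is concentrated on a narrow
# region; only its integral over the support must equal 1.
u = Uniform(0.0, 0.5)  # pdf is constant 2 on [0, 0.5)
print(u.log_prob(torch.tensor(0.25)).exp())  # tensor(2.)

# Same for a Normal with a small scale: the pdf peaks at
# 1 / (scale * sqrt(2*pi)), which is ~3.989 for scale=0.1.
n = Normal(0.0, 0.1)
print(n.log_prob(torch.tensor(0.0)).exp())
```

So positive log_prob values alone would not be a bug; the question is whether the Jacobian correction applied by SigmoidTransform is right.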

Thank you.

Sorry, I just found this existing issue: https://github.com/pytorch/pytorch/issues/7637
I thought it was an issue with Transform; sorry for the duplicate.