Log probability of RelaxedBernoulli

Here is a small snippet of code that initializes a RelaxedBernoulli distribution with temperature 0.1 and probability parameter 0.8:

import torch

dist = torch.distributions.relaxed_bernoulli.RelaxedBernoulli(
    temperature=torch.tensor(0.1), probs=torch.tensor([0.8]))
samp = dist.rsample()
samp_prob = torch.exp(dist.log_prob(samp))

Most of the time, exponentiating the log-probability like this gives a value much greater than 1. What am I missing here?
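For reference, here is a slightly expanded version of the snippet that draws many samples and counts how often the exponentiated log-probability exceeds 1 (the seed and sample count are arbitrary choices for reproducibility, not part of the original setup):

```python
import torch

torch.manual_seed(0)
dist = torch.distributions.relaxed_bernoulli.RelaxedBernoulli(
    temperature=torch.tensor(0.1), probs=torch.tensor([0.8]))

# Draw a batch of samples and exponentiate their log-probabilities
samples = dist.rsample((1000,))
probs = torch.exp(dist.log_prob(samples))

# Fraction of samples whose exponentiated log-probability is above 1
frac_above_one = (probs > 1).float().mean()
print(frac_above_one)
```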