I’m having an issue, but I’m not sure whether it’s a PyTorch problem.
I have fixed the random seed with the following code (at the entry point of the program, and even at the beginning of each file):
import os
import random

import numpy as np
import torch

seed = 2022
random.seed(seed)
np.random.seed(seed)
torch.manual_seed(seed)
torch.cuda.manual_seed(seed)
torch.cuda.manual_seed_all(seed)
torch.backends.cudnn.deterministic = True
os.environ['PYTHONHASHSEED'] = str(seed)
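My understanding is that, once the seed is fixed like this, reseeding should reproduce exactly the same draws. A minimal check of what I expect (my own toy example, not from my program):

```python
import torch

torch.manual_seed(2022)
a = torch.randn(3)

# Reseeding with the same value should replay the same sequence
torch.manual_seed(2022)
b = torch.randn(3)

print(torch.equal(a, b))  # True
```

This does print True for me on CPU, which is why the behavior below confuses me.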
class Stochastic(nn.Module):
    """
    Base stochastic layer that uses the
    reparametrization trick [Kingma 2013]
    to draw a sample from a distribution
    parametrised by mu and log_var.
    """
    def reparametrize(self, mu, log_var):
        epsilon = Variable(torch.randn(mu.size()), requires_grad=False)
        if mu.is_cuda:
            epsilon = epsilon.cuda()
        # log_std = 0.5 * log_var
        # std = exp(log_std)
        std = log_var.mul(0.5).exp_()
        # z = std * epsilon + mu
        z = mu.addcmul(std, epsilon)
        return z
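As I understand it, `addcmul` here computes `mu + std * epsilon` elementwise, i.e. the usual reparametrization `z = mu + sigma * epsilon`. A quick sanity check I ran with toy shapes (my own example, not from the original code):

```python
import torch

torch.manual_seed(2022)
mu = torch.zeros(2, 3)
log_var = torch.zeros(2, 3)       # log_var = 0  =>  std = exp(0) = 1

std = log_var.mul(0.5).exp()      # std = exp(0.5 * log_var)
eps = torch.randn(2, 3)
z = mu.addcmul(std, eps)          # mu + std * eps, elementwise

print(torch.allclose(z, mu + std * eps))  # True
```

So the arithmetic itself seems fine; my problem is only with the randomness of `epsilon`.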
I use the above code (taken from someone else’s implementation) in my program, and I can’t reproduce my results stably: the values drawn by `torch.randn` in `reparametrize` are different on every run, even though the seed is fixed. I’m confused about that.
I’m not sure whether `Variable`, `randn`, and `requires_grad` are being used correctly here.
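From the docs I believe `torch.randn_like` would avoid the deprecated `Variable` wrapper and the manual `.cuda()` call, but I’m not sure whether it changes the randomness behavior. What I tried (my own sketch):

```python
import torch

mu = torch.zeros(2, 3)

# randn_like draws a tensor with mu's shape, dtype, and device;
# requires_grad is False by default, so no Variable wrapper is needed
epsilon = torch.randn_like(mu)

print(epsilon.shape, epsilon.requires_grad)  # torch.Size([2, 3]) False
```

Is this an equivalent replacement, and could the original `Variable(torch.randn(...))` pattern be related to my reproducibility problem?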
Looking forward to your answers, thanks in advance