An issue with the random seed

I’m having an issue, but I’m not sure whether it’s actually a PyTorch problem.

I have fixed the random seed with the following code (at the entry point of the program, and even at the beginning of each file):

import os
import random

import numpy as np
import torch

seed = 2022
random.seed(seed)
np.random.seed(seed)
torch.manual_seed(seed)
torch.cuda.manual_seed(seed)
torch.cuda.manual_seed_all(seed)
torch.backends.cudnn.deterministic = True
os.environ['PYTHONHASHSEED'] = str(seed)
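(For completeness, I understand that cuDNN benchmarking and non-deterministic CUDA kernels can also affect reproducibility, so these two flags may be relevant as well, though I am not sure they matter for my case:)

torch.backends.cudnn.benchmark = False
torch.use_deterministic_algorithms(True)  # newer PyTorch versions; raises if an op has no deterministic implementation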
import torch
import torch.nn as nn
from torch.autograd import Variable


class Stochastic(nn.Module):
    """
    Base stochastic layer that uses the
    reparametrization trick [Kingma 2013]
    to draw a sample from a distribution
    parametrised by mu and log_var.
    """
    def reparametrize(self, mu, log_var):
        epsilon = Variable(torch.randn(mu.size()), requires_grad=False)

        if mu.is_cuda:
            epsilon = epsilon.cuda()

        # log_std = 0.5 * log_var
        # std = exp(log_std)
        std = log_var.mul(0.5).exp_()

        # z = std * epsilon + mu
        z = mu.addcmul(std, epsilon)

        return z

I use the above code, taken from someone else’s project, in my program.

I can’t reproduce my results stably, because I found that in the above code the values drawn by torch.randn in the epsilon = Variable(torch.randn(mu.size()), ...) line are different on each run, even though the seed is fixed. I’m confused about that.

I’m also not sure whether Variable, torch.randn, and requires_grad=False are being used correctly here.

Looking forward to your answers, thanks in advance

I tried running your sample code, but I get the same value of z across different runs.
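For reference, this is the kind of minimal check I mean (re-running the script should print identical numbers each time):

import torch

torch.manual_seed(2022)
print(torch.randn(3))  # prints the same three values on every run of the script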

Why do you intend to use nn.Module if none of the parameters are trainable? Wouldn’t a simple function called reparameterize suffice? It would just return z, and you wouldn’t have to state whether epsilon is trainable or not.
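Something like the sketch below is what I mean (assuming mu and log_var are ordinary tensors; torch.randn_like keeps the noise on the same device and dtype as mu):

import torch

def reparameterize(mu, log_var):
    # epsilon is plain noise, so no gradient bookkeeping is needed for it
    epsilon = torch.randn_like(mu)
    std = (0.5 * log_var).exp()
    return mu + std * epsilon  # z = mu + std * epsilon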

That is someone else’s program. I use the FastAPI framework to wrap it and expose an API interface, so I now suspect the problem is related to FastAPI background tasks, but I don’t know how to debug it, confused…
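The kind of check I am planning (a rough sketch; the endpoint name and handler are made up, not my real code) is to re-seed per request and log the RNG state inside the background task, to see whether it differs between requests:

from fastapi import BackgroundTasks, FastAPI
import torch

app = FastAPI()

def run_model_task():
    # log what the default RNG looks like inside the background task
    print("initial_seed:", torch.initial_seed())
    print("sample:", torch.randn(3))  # compare these values across requests/runs
    # ... the actual call into the model / Stochastic.reparametrize would go here ...

@app.post("/predict")  # made-up endpoint name
def predict(background_tasks: BackgroundTasks):
    torch.manual_seed(2022)  # re-seed per request while debugging
    background_tasks.add_task(run_model_task)
    return {"status": "queued"}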