How to initialize Variables in PyTorch?

How do I initialize a layer if it is not constructed using nn.Linear, for example?

self.nlc = Variable(torch.randn(10, 1, 1, 1, 1).type(torch.FloatTensor), requires_grad=True)

Is there a method like the one below?

self.nlc = Variable(torch.glorot(10, 1, 1, 1, 1).type(torch.FloatTensor), requires_grad=True)

Variables are deprecated since version 0.4.0, so you can just use tensors now (and set requires_grad=True in the initialization).
torch.nn.init provides various methods to initialize your parameters. :wink:
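For instance, the Glorot initialization you mentioned is available as torch.nn.init.xavier_uniform_ (and xavier_normal_); both fill an existing tensor in-place:

import torch
import torch.nn as nn

w = torch.empty(10, 5)      # uninitialized tensor (needs at least 2 dims for fan_in/fan_out)
nn.init.xavier_uniform_(w)  # fill in-place with Glorot-uniform values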

Sorry, but could you give an example? Do you create an empty tensor and then initialize it like below? Is that correct?

A = torch.empty(5, 7, device='cuda', requires_grad=True)  # leaf tensor on the GPU
torch.nn.init.normal_(A)  # fills A in-place (init functions run under no_grad)

Your example is correct.
Note that you should wrap this tensor in nn.Parameter if you would like to optimize it inside an nn.Module. nn.Parameters are automatically registered inside modules if you assign them as attributes:

import torch
import torch.nn as nn

class MyModule(nn.Module):
    def __init__(self):
        super(MyModule, self).__init__()
        A = torch.empty(5, 7, device='cpu')
        # wrapping the tensor in nn.Parameter registers it as a module parameter
        self.A = nn.Parameter(A)

    def forward(self, x):
        return x * self.A

module = MyModule()
print(dict(module.named_parameters()))
> {'A': Parameter containing:
tensor([[-7.8389e-37,  3.0623e-41, -7.8627e-37,  3.0623e-41,  1.1210e-43,
          0.0000e+00,  8.9683e-44],
        [ 0.0000e+00, -7.8579e-37,  3.0623e-41,  1.4013e-45,  0.0000e+00,
          0.0000e+00,  0.0000e+00],
        [ 0.0000e+00,  0.0000e+00,  0.0000e+00,  0.0000e+00,  0.0000e+00,
          0.0000e+00, -7.7193e-37],
        [ 3.0623e-41,  1.8077e-43,  0.0000e+00,  4.7530e-06,  4.5845e-41,
         -7.8459e-38,  3.0623e-41],
        [ 0.0000e+00,  0.0000e+00,  1.3593e-43,  0.0000e+00, -7.9340e-37,
          3.0623e-41, -7.8739e-37]], requires_grad=True)}

By wrapping them in nn.Parameter, the requires_grad attribute will be set to True by default.
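Once it is registered, an optimizer will pick the parameter up through module.parameters(). A minimal sketch (the learning rate and input are just placeholders):

import torch.optim as optim

optimizer = optim.SGD(module.parameters(), lr=0.1)  # placeholder lr
out = module(torch.randn(5, 7))  # dummy input matching A's shape
out.mean().backward()            # populates module.A.grad
optimizer.step()                 # updates module.A in-place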

Let me know, if you have more questions or something is unclear.


That is much better, thank you very much!