Parameter is not getting initialized

import torch
import torch.nn as nn


class A(nn.Module):
    def __init__(self):
        super(A, self).__init__()
        self.p = nn.Parameter(torch.randn(1))

When I call print(self.p.shape, self.p) inside def forward, it returns:

torch.Size([0]) Parameter containing:
tensor([], requires_grad=True)

So self.p is not initialized. How do I initialize it? Please help.

It works for me.

import torch
import torch.nn as nn


class A(nn.Module):
    def __init__(self):
        super(A, self).__init__()
        self.p = nn.Parameter(torch.randn(1))
        
    def forward(self):
        print(self.p.shape, self.p)
        
a = A()
a()

which prints out:

torch.Size([1]) Parameter containing:
tensor([-0.3897], requires_grad=True)


I don't know what exactly the error was; it didn't run in the beginning, but now it's running. However, I'm getting errors if I don't remove A, self inside super(). Any idea why?

What kind of errors are you seeing if you keep super(A, self).__init__(), and which Python version are you using?
The arguments were only necessary in Python 2, if I'm not mistaken, but I haven't seen it raise an error in Python 3.
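
For reference, both call styles should be equivalent in Python 3. A minimal sketch (the class names B and C are just for illustration):

import torch
import torch.nn as nn


class B(nn.Module):
    def __init__(self):
        super().__init__()  # Python 3 style: no arguments needed
        self.p = nn.Parameter(torch.randn(1))


class C(nn.Module):
    def __init__(self):
        super(C, self).__init__()  # Python 2 style: still works in Python 3
        self.p = nn.Parameter(torch.randn(1))


# both modules register their parameter the same way
print(B().p.shape, C().p.shape)

One thing to double-check: super(A, self) raises a NameError if the class was renamed but the old name was left inside super(), which might explain errors that go away once the arguments are removed.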