Not all parameters show up in model.parameters()

class NN(pl):
    def __init__(self,input_dim,num_units1,num_units2,output):
        
        super(NN,self).__init__()
        self.w=[]
        for i in range(input_dim):
            self.w.append(Variable(torch.rand((1,),requires_grad=True)))
        
        self.w=Variable(torch.reshape(torch.tensor(self.w,dtype=torch.float32),(1,input_dim)),requires_grad=True)
        

    def forward(self):
        .........
        .........

When I run model.parameters(), I can't see self.w among the learnable parameters. How do I make self.w a learnable parameter?

You need to register self.w as an nn.ParameterList (or a single nn.Parameter) rather than a plain Python list: attributes that are plain lists or ordinary tensors are not registered by the module, so they are not counted when calling model.parameters() and are never updated by the optimizer.
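
A minimal sketch of that fix. It assumes nn.Module as the base class and uses a placeholder forward, since the original base class and forward body are not shown in full:

import torch
import torch.nn as nn

class NN(nn.Module):  # assumed base class; use pl.LightningModule if you are on Lightning
    def __init__(self, input_dim, num_units1, num_units2, output):
        super().__init__()
        # Registering the weights as an nn.Parameter makes the module track them,
        # so they appear in model.parameters() and are updated by the optimizer.
        self.w = nn.Parameter(torch.rand(1, input_dim))
        # Equivalent alternative: one parameter per input feature in an nn.ParameterList
        # self.w = nn.ParameterList(
        #     [nn.Parameter(torch.rand(1)) for _ in range(input_dim)]
        # )

    def forward(self, x):
        # placeholder forward pass, just to make the sketch runnable
        return x * self.w

model = NN(input_dim=4, num_units1=8, num_units2=8, output=1)
print(list(model.parameters()))  # self.w is now listed

Note also that Variable has been deprecated since PyTorch 0.4; inside a module, nn.Parameter (or a plain tensor with requires_grad=True for non-registered leaves) is the current way to create trainable tensors.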