Optimizer got an empty parameter list in hand-written code

Hi! I get the error ValueError: optimizer got an empty parameter list when I run code containing the following classes:

import numpy as np
import torch
from torch import nn
from scipy.interpolate import InterpolatedUnivariateSpline

# rar, v_circ_b, v_circ_MN, G_u and param_t are defined elsewhere in my code.

class Mass_and_v():
    def __init__(self, param):
        self.parameter = param

    # get_mass + v_circ_tot
    def __call__(self, x):
        ener_f = 56.0  # keV
        r, mass = rar.model(self.parameter[3:].detach().numpy(), ener_f)
        # Truncate the profiles at the first NaN entry, if there is one
        isnan = np.argwhere(np.isnan(mass))
        if np.any(isnan):
            k = isnan[0][0]
        else:
            k = -1
        r_s = r[0:k]
        mass_s = mass[0:k]
        mass_spline = InterpolatedUnivariateSpline(r_s, mass_s, k=4)
        r_max = np.amax(r_s)
        if x <= r_max:
            return np.sqrt(v_circ_b(x, self.parameter[0].detach().numpy())**2 + v_circ_MN(x, self.parameter[1:3].detach().numpy())**2 + G_u*mass_spline(x)/x)
        else:
            return np.sqrt(v_circ_b(x, self.parameter[0].detach().numpy())**2 + v_circ_MN(x, self.parameter[1:3].detach().numpy())**2 + G_u*mass_spline(r_max)/x)

class FitModel(nn.Module):
    def __init__(self):
        super(FitModel, self).__init__()
        self.nonlinear_stack = Mass_and_v(param_t)
        
    def forward(self, x):
        output = self.nonlinear_stack(x)
        return output

I get the error when execution reaches the line

optimizer = torch.optim.Adam(model.parameters(), lr=eta, eps=1e-08, weight_decay=0, amsgrad=False)

I hope someone can help me. Thank you so much in advance 🙂

Mass_and_v is not an nn.Module, so the tensor stored in self.parameter is never registered with FitModel: model.parameters() yields nothing, which is exactly what the ValueError is telling you. The class also uses numpy arrays internally, which Autograd won't be able to track.
Derive Mass_and_v from nn.Module and register the parameters as nn.Parameter in its __init__.
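
A minimal sketch of the registration part, assuming param is a torch tensor (the numpy issue is a separate point, see below):

class Mass_and_v(nn.Module):
    def __init__(self, param):
        super().__init__()  # must run before any parameter is assigned
        # nn.Parameter registers the tensor so that model.parameters() finds it
        self.parameter = nn.Parameter(param)

    def forward(self, x):  # nn.Module subclasses implement forward, not __call__
        ...

After this change, list(FitModel().parameters()) is no longer empty and the optimizer can be constructed.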

Great! That was very useful. Thank you!
So, let me see if I understand (I am kind of new to PyTorch): I need to declare the class as Mass_and_v(nn.Module), and I need to change the line self.parameter = param to self.parameter = nn.Parameter(data=param, requires_grad=True). Is that right? Well, if I do this, I now get:

AttributeError: cannot assign parameters before Module.__init__() call

Apart from all of this, do you suggest that I write the Mass_and_v class entirely in PyTorch, without using numpy?

The AttributeError is raised because you've most likely forgotten to add the super().__init__() call to the __init__ method.
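
In other words, the parent constructor has to run before any nn.Parameter is assigned:

class Mass_and_v(nn.Module):
    def __init__(self, param):
        super().__init__()  # this line has to come first
        self.parameter = nn.Parameter(param)  # only valid after super().__init__()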

Yes, I would try to avoid using other libraries if possible, since you would break the computation graph and PyTorch won’t be able to track these operations. If really necessary, you could write a custom autograd.Function and define the forward as well as backward methods manually.
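
A generic sketch of that pattern, using a toy numpy sqrt instead of your mass model (the names here are illustrative, not from your code):

import numpy as np
import torch

class NumpySqrt(torch.autograd.Function):
    @staticmethod
    def forward(ctx, input):
        ctx.save_for_backward(input)
        # The numpy call below is invisible to Autograd...
        result = np.sqrt(input.detach().numpy())
        return torch.from_numpy(result)

    @staticmethod
    def backward(ctx, grad_output):
        # ...so the gradient d sqrt(x)/dx = 1 / (2 sqrt(x)) is supplied by hand
        input, = ctx.saved_tensors
        return grad_output / (2 * torch.sqrt(input))

x = torch.tensor([4.0], requires_grad=True)
y = NumpySqrt.apply(x)
y.backward()
print(x.grad)  # tensor([0.2500])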

Great! I understand. Thank you so much for your help 🙂