Hi, I am writing a PyTorch program for cross-domain recommendation.
I would like to summarise my model as follows:
input the users and the items they interacted with, retrieve their embeddings, pass them through the model, and get the output.
You need to give your Module’s parameters to the optimizer, not the Module itself. In particular you can use self.U.parameters() here.
Or if self is already a Module itself, self.parameters().
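For instance, a minimal sketch of the usual pattern (the module here is just illustrative):

```python
import torch
from torch import nn

net = nn.Linear(4, 2)                              # any nn.Module
opt = torch.optim.SGD(net.parameters(), lr=1e-2)   # pass the parameters, not the module
```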
self is not a Module in my case. I have tried using self.U.parameters(), and the error is now
“optimizer can only optimize Tensors, but one of the params is Module.parameters”
Do you have any idea why this is so?
I was trying to write a customisable program, so I didn’t inherit from nn.Module. Thank you!
Yes, I have called the function self.U.parameters(), but it still gives this error:
“optimizer can only optimize Tensors, but one of the params is Module.parameters”
I think the problem is that my class does not inherit from nn.Module, so the .parameters() method is not available to me.
Is there any other way to turn an nn.Embedding into something the optimizer can understand, so that I can add it to the parameter list manually?
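For what it’s worth, even without inheriting nn.Module there is a way: the learnable table of an nn.Embedding is exposed as .weight, which is an nn.Parameter (a subclass of Tensor), so it can be appended to a manual parameter list directly. A sketch, with illustrative names:

```python
import torch
from torch import nn

# Plain class, deliberately NOT inheriting nn.Module; names are illustrative.
class Recommender:
    def __init__(self, n_users, edim_u, lr=3e-4):
        self.U = nn.Embedding(n_users, edim_u)
        params = []
        params.append(self.U.weight)   # .weight is an nn.Parameter, i.e. a Tensor
        # or collect everything the submodule owns:
        # params.extend(self.U.parameters())
        self.optimizer = torch.optim.SGD(params, lr=lr)

model = Recommender(10, 4)
```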
Hello, that’s the only error message I got.
so basically, instead of using model.parameters(), I send my own parameter list (built up from parameter = []) to the optimizer.
I tried adding self.U.parameters() to the list and optimizing it, and the error was “optimizer can only optimize Tensors, but one of the params is Module.parameters”…
It seems that I have found the cause of your problem.
self.U.parameters() is not a single parameter, but a generator over parameters (see the comments below).
import torch
from torch import nn

class Model(nn.Module):
    def __init__(self, nUser, edim_u, lr=3e-4):
        super(Model, self).__init__()
        self.U = nn.Embedding(nUser, edim_u)
        # A plain tensor with requires_grad=True (Variable is deprecated).
        self.other_parameters = torch.zeros(nUser, edim_u, requires_grad=True)
        self.lr = lr
        # This would raise [1], because self.U.parameters() is a generator, not a parameter:
        # self.params = [self.U.parameters(), self.other_parameters]
        # To correct the problem, unpack the generator:
        self.params = [*self.U.parameters(), self.other_parameters]  # = list(self.U.parameters()) + [self.other_parameters]
        # Then:
        self.optimizer = torch.optim.SGD(self.params, lr=self.lr)

model = Model(2, 1)
# [1] TypeError: optimizer can only optimize Tensors, but one of the params is Module.parameters
I advise you to take a look at the documentation.
For example, the torch.optim documentation requires the parameters to be Tensors (see the section “Constructing it”; older releases said Variables).
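As a sanity check of the unpacked parameter list, a minimal end-to-end sketch (the objective and data are made up for illustration):

```python
import torch
from torch import nn

U = nn.Embedding(5, 3)
extra = torch.zeros(5, 3, requires_grad=True)
params = [*U.parameters(), extra]            # unpack the generator
optimizer = torch.optim.SGD(params, lr=0.1)

idx = torch.tensor([0, 2])
loss = (U(idx) + extra[idx]).pow(2).sum()    # toy objective
optimizer.zero_grad()
loss.backward()
optimizer.step()                             # both tensors are updated
```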