PyTorch: optimizer got an empty parameter list

I’m practicing PyTorch with an example from an Internet blog,
but I have a problem with `model.parameters()`. My Python version is 3.7.

import torch
from torch.autograd import Variable

x_data=torch.tensor([[1.0],[2.0],[3.0]])
y_data=torch.tensor([[2.0],[4.0],[6.0]])

class MyModel(torch.nn.Module):
    def _init_(self):
        super(MyModel, self)._init_()
        self.linear = torch.nn.Linear(1, 1)

    def forward(self, x):
        y_pred = self.linear(x)
        return y_pred

model=MyModel()

criterion=torch.nn.MSELoss(size_average=False)
optimizer=torch.optim.SGD(list(model.parameters()),lr=0.01)

for epoch in range(501):
    y_pred = model(x_data)
    loss = criterion(y_pred, y_data)
    if epoch % 100 == 0:
        print(epoch, loss.data[0])
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

Your code is working fine, apart from some minor issues:

  • Variables are deprecated since PyTorch 0.4.0, so use tensors instead
  • use loss.item() to print out the loss value instead of .data
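Putting those fixes together, the whole script could look roughly like this. Note this sketch assumes a reasonably recent PyTorch, where `size_average=False` has been replaced by `reduction='sum'`:

```python
import torch

x_data = torch.tensor([[1.0], [2.0], [3.0]])
y_data = torch.tensor([[2.0], [4.0], [6.0]])

class MyModel(torch.nn.Module):
    def __init__(self):  # double underscores on both sides
        super(MyModel, self).__init__()
        self.linear = torch.nn.Linear(1, 1)

    def forward(self, x):
        return self.linear(x)

model = MyModel()
# reduction='sum' is the modern spelling of size_average=False
criterion = torch.nn.MSELoss(reduction='sum')
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

for epoch in range(501):
    y_pred = model(x_data)
    loss = criterion(y_pred, y_data)
    if epoch % 100 == 0:
        print(epoch, loss.item())  # .item() instead of .data[0]
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```

No `Variable` wrapper is needed anymore; plain tensors track gradients through the module's registered parameters.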

It was my mistyping. Now I’ve fixed it (I had used just a single underscore on each side of init, instead of the double underscores of `__init__`).
Sorry for my mistake and thank you so much for your feedback :slight_smile:
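For anyone landing here with the same error: a method named `_init_` (single underscores) is never called by Python, so `nn.Module`'s own `__init__` runs instead, no layers are registered, and `model.parameters()` comes back empty. A minimal sketch of the failure:

```python
import torch

class Broken(torch.nn.Module):
    def _init_(self):  # single underscores: NOT the constructor
        super(Broken, self)._init_()
        self.linear = torch.nn.Linear(1, 1)

broken = Broken()  # nn.Module.__init__ runs; _init_ is never invoked
print(list(broken.parameters()))  # empty list

# Passing that empty list to an optimizer raises
# "ValueError: optimizer got an empty parameter list"
```

Renaming the method to `__init__` registers the `Linear` layer and makes its parameters visible to the optimizer.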