How to manually register variables to a module's parameters

Can someone help me understand what determines the contents of model.parameters()? In the code below, model.parameters() comes back empty; I'm guessing that's because self.w and self.b aren't modules. Here's the code:

import numpy as np
import torch
import torch.optim as optim
from torch import nn
from torch.autograd import Variable

COUNT = 10

################################################################################

def scalar(x):
  # wrap a Python scalar in a 1-element FloatTensor
  return torch.FloatTensor([x])

################################################################################

class Net(nn.Module):
  def __init__(self):
    super(Net, self).__init__()
    self.w = Variable(scalar(0.1), requires_grad=True)
    self.b = Variable(scalar(0), requires_grad=True)

  def forward(self, x):
    return self.w * x + self.b

  def loss(self, prediction, label):
    return (prediction - label)**2

################################################################################

data = np.random.standard_normal((COUNT, 1)) + 5
labels = (data * 3) + 5 + np.random.standard_normal((COUNT, 1))  # noise shaped to match data

model = Net()
optimizer = optim.SGD(model.parameters(), lr=0.01)  # SGD requires an explicit lr; 0.01 is arbitrary

for datum, label in zip(data, labels):
  datum, label = Variable(scalar(datum)), Variable(scalar(label))

  optimizer.zero_grad()

  prediction = model(datum)
  loss = model.loss(prediction, label)

  loss.backward()
  optimizer.step()

  print('loss', loss)

model.w and model.b are used in the forward pass, but model.parameters() is empty. How can I register them so that model.parameters() picks them up?


Replace Variable with nn.Parameter
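
Applied to the class above, that would look something like this (a minimal sketch, reusing the scalar helper from the question; nn.Parameter defaults to requires_grad=True):

    class Net(nn.Module):
      def __init__(self):
        super(Net, self).__init__()
        # nn.Parameter is picked up by Module.__setattr__ and registered,
        # so both show up in model.parameters()
        self.w = nn.Parameter(scalar(0.1))
        self.b = nn.Parameter(scalar(0))

      def forward(self, x):
        return self.w * x + self.b

With that change, optim.SGD(model.parameters(), lr=0.01) receives a non-empty parameter list.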


The main thing that determines it is Module's __setattr__ (see the PyTorch source): assigning an nn.Parameter as an attribute stores it in the module's _parameters dict, which is what parameters() iterates over.
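
A quick way to see the effect (a minimal sketch; the Demo class is just for illustration):

    import torch
    from torch import nn
    from torch.autograd import Variable

    class Demo(nn.Module):
      def __init__(self):
        super(Demo, self).__init__()
        self.w = nn.Parameter(torch.zeros(1))                  # intercepted by __setattr__, registered
        self.v = Variable(torch.zeros(1), requires_grad=True)  # plain Variable, not registered

    print(list(Demo().parameters()))  # only w appears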

Perfect, thank you!

note that doing:

    self.W = torch.nn.Parameter(w_init)
    self.mod_list = torch.nn.ModuleList([self.W])

does not work (note also that w_init should not be a Variable; plain tensors seem to work). I'm not sure why it shouldn't.
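
If you want a list-like container whose entries do get registered, nn.ParameterList is the parameter analogue of nn.ModuleList (ModuleList only accepts nn.Module instances, which a Parameter is not). A minimal sketch:

    import torch
    from torch import nn

    class Net(nn.Module):
      def __init__(self):
        super(Net, self).__init__()
        w_init = torch.zeros(3)
        # every entry of a ParameterList is registered and shows up
        # in model.parameters()
        self.params = nn.ParameterList([nn.Parameter(w_init)])

    print(list(Net().parameters()))  # the wrapped w_init appears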