Can I use nn.Parameter instead of torch.autograd.Variable?

feeling confused about these two…

Parameter is basically a Variable with one extra behavior: when you assign it as an attribute of an nn.Module, it gets automatically registered as one of the module's parameters. E.g., if you want the optimizer to update the tensor's values as well, you would use Parameter.

E.g., let’s say you define a model via nn.Module

class ResNet(nn.Module):
    ...

And then pass it to the optimizer:

...
model = ResNet(num_classes)
optimizer = torch.optim.Adam(model.parameters(), lr=learning_rate)
...

You would need Parameter so that the tensor is included in the module's parameter list, i.e., in what model.parameters() returns, and hence gets passed to the optimizer. Note that autograd works just fine with a plain Variable, but if you don't wrap the tensor in Parameter, the optimizer never sees it and its values won't be updated. Also note that layers like nn.Linear() already use Parameter internally for their weights and biases, so you don't have to do anything extra there.
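
Here's a minimal sketch illustrating the difference (the class name ToyModel and the attribute names weight and scale are made up for this example): a tensor wrapped in nn.Parameter shows up in model.parameters(), while a plain tensor attribute does not.

import torch
import torch.nn as nn

class ToyModel(nn.Module):
    def __init__(self):
        super().__init__()
        # wrapped in nn.Parameter: registered automatically,
        # returned by parameters() / named_parameters()
        self.weight = nn.Parameter(torch.randn(3, 3))
        # plain tensor attribute: invisible to parameters()
        self.scale = torch.randn(3, 3)

model = ToyModel()
print([name for name, _ in model.named_parameters()])  # prints ['weight']

If you then pass model.parameters() to an optimizer, only weight gets updated; scale is ignored, even though autograd would happily compute gradients for it if it had requires_grad=True.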
