What is the difference between autograd.Variable and nn.Parameter?

I’m confused about when to use each of them and how they differ.
Can anyone explain?

nn.Parameter is a subclass of torch.Tensor (historically, of torch.autograd.Variable), so most behaviors are the same.
The most important difference is that if you assign an nn.Parameter as an attribute in an nn.Module's constructor, it is automatically added to the module's parameters, just like nn.Module attributes (submodules) are. Here is an example:

import torch

class MyModule(torch.nn.Module):

    def __init__(self):
        super().__init__()
        # plain Variable/Tensor attribute: not registered as a parameter
        self.variable = torch.autograd.Variable(torch.Tensor([5]))
        # nn.Parameter attribute: automatically registered in net.parameters()
        self.parameter = torch.nn.Parameter(torch.Tensor([10]))

net = MyModule()
for param in net.parameters():
    print(param)

"""
output:
Parameter containing:
tensor([10.], requires_grad=True)
"""

The output contains no self.variable, only self.parameter. That means that if we create an optimizer with net.parameters() as its first argument and call optimizer.step(), only self.parameter will be updated automatically.
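
For illustration, a minimal sketch of that optimizer behavior, assuming the MyModule class from the example above (the SGD learning rate and the dummy loss below are arbitrary placeholders):

import torch

net = MyModule()
# the optimizer only sees self.parameter, because only it is registered
optimizer = torch.optim.SGD(net.parameters(), lr=0.1)

# a dummy scalar "loss" built from the registered parameter
loss = (net.parameter ** 2).sum()
loss.backward()
optimizer.step()

print(net.parameter)  # changed by the optimizer step
print(net.variable)   # unchanged: it was never registered as a parameter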

Also, Variables are not needed anymore; you can simply use Tensors. And a Parameter is a specific Tensor that is marked as being a parameter of an nn.Module, and so will be returned when calling .parameters() on that Module.
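
As a quick sketch of that relationship (the tensor values here are arbitrary):

import torch

p = torch.nn.Parameter(torch.ones(3))
print(isinstance(p, torch.Tensor))  # True: Parameter is a Tensor subclass
print(p.requires_grad)              # True by default for a Parameter

t = torch.ones(3)
print(t.requires_grad)              # False by default for a plain Tensor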
