Practical usage of torch.autograd.Variable() and nn.Parameter()

I’m trying to figure out the difference between nn.Parameter() and torch.autograd.Variable(), and the practical use one could make of each.

For now, I’ve only got some experience using nn.Embedding(), which provides embeddings of a specified dimension for labels/words in a dictionary.

But how can nn.Parameter() and Variable() be used in practice? Could anyone provide me with a use-case example to improve my understanding?


Variables have been deprecated since PyTorch 0.4.0 (so for about 4 years now :wink: ).
nn.Parameter wraps a tensor and marks it as trainable. Parameters are initialized in nn.Modules and trained afterwards.
If you are writing a custom module, this would be an example of how nn.Parameter is used:

import torch
import torch.nn as nn

class MyModel(nn.Module):
    def __init__(self):
        super().__init__()
        # a learnable 1x1 tensor, registered automatically as a parameter of the module
        self.param = nn.Parameter(torch.randn(1, 1))

    def forward(self, x):
        x = x * self.param
        return x

model = MyModel()
print(dict(model.named_parameters()))
# {'param': Parameter containing:
# tensor([[0.6077]], requires_grad=True)}

out = model(torch.randn(1, 1))
loss = out.mean()
loss.backward()

print(model.param.grad)
# tensor([[-1.3033]])
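
Since the parameter is registered, an optimizer can then pick it up via model.parameters(). A minimal sketch continuing the example above (the learning rate is an arbitrary choice):

optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

out = model(torch.randn(1, 1))
loss = out.mean()

optimizer.zero_grad()
loss.backward()
optimizer.step()  # updates model.param in place using its .grad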

Many thanks!

So, if I find code where Variable is used, like the one I’m currently looking at (locuslab/qpth: A fast and differentiable QP solver for PyTorch), how should I treat/modify it?
Has Variable been absorbed by nn.Parameter()?


Hi Frederico!

At the most basic technical level, Variable has been absorbed by Tensor.

In the old days (pre-deprecation), a Variable wrapped a Tensor, adding the structure (such as the requires_grad property) necessary for the Tensor to participate in autograd. Now Tensor has that structure built in directly. (The cost in overhead is negligible if you don’t use that structure.)
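
In practice, migrating old Variable code usually just means deleting the wrapper. A minimal before/after sketch:

# Old (pre-0.4.0) style:
# from torch.autograd import Variable
# x = Variable(torch.randn(3), requires_grad=True)

# Current style: a plain Tensor participates in autograd directly
import torch
x = torch.randn(3, requires_grad=True)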

So you can use autograd to train a Tensor without further ado.
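
For example, here is a minimal sketch of training a bare Tensor with plain autograd, no Module or Parameter involved (the target value 2.0 and the step size are made up for illustration):

import torch

w = torch.randn(1, requires_grad=True)  # a bare tensor, not a Parameter

for _ in range(100):
    loss = (w - 2.0) ** 2    # drive w toward the made-up target
    loss.backward()
    with torch.no_grad():
        w -= 0.1 * w.grad    # manual gradient-descent step
    w.grad.zero_()

print(w)  # close to 2.0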

Parameter wraps Tensor to help Modules and Optimizers keep track of which Tensors you want to train. If you have a trainable Tensor in a Module, you will typically want to wrap it in a Parameter so that, for example, it shows up automatically when you call my_module.parameters(), a useful convenience. (But, again, it doesn’t need to be a Parameter to work with autograd.)
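
To make the registration point concrete, here is a small sketch (the module and attribute names are made up):

import torch
import torch.nn as nn

class TwoTensors(nn.Module):
    def __init__(self):
        super().__init__()
        self.a = nn.Parameter(torch.randn(2))         # registered automatically
        self.b = torch.randn(2, requires_grad=True)   # trains with autograd, but is not registered

m = TwoTensors()
print([name for name, _ in m.named_parameters()])
# ['a']  -- only the Parameter shows up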

Best.

K. Frank
