How do I use `retain_variables` in a Variable's `backward` function?

I tried the following code:

```python
import torch
from torch.autograd import Variable

x = Variable(torch.ones(2, 2), requires_grad=True)
y = x + 2
y.backward(torch.ones(2, 2), retain_variables=True)
print("first gradient of x is:")
print(x.grad)

z = y * y
gradient = torch.ones(2, 2)
z.backward(gradient)
print("second gradient of x is:")
print(x.grad)
```

And then the same code with `retain_variables=False`:

```python
import torch
from torch.autograd import Variable

x = Variable(torch.ones(2, 2), requires_grad=True)
y = x + 2
y.backward(torch.ones(2, 2), retain_variables=False)
print("first gradient of x is:")
print(x.grad)

z = y * y
gradient = torch.ones(2, 2)
z.backward(gradient)
print("second gradient of x is:")
print(x.grad)
```

Both runs print the same result:

```
first gradient of x is:
Variable containing:
 1  1
 1  1
[torch.FloatTensor of size 2x2]
second gradient of x is:
Variable containing:
 7  7
 7  7
[torch.FloatTensor of size 2x2]
```
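If I understand the docs correctly, the flag should only make a visible difference when you backprop a *second* time through buffers saved by the first backward. In my snippet above, `z = y * y` is built *after* the first backward, and `y = x + 2` saves no input buffers anyway, so nothing that was freed ever gets reused. Here is a minimal sketch of a case where it does matter: calling `backward` twice through the same multiplication node, which saves its input. I'm writing it with the newer `retain_graph` spelling of this argument and the tensor-based API, assuming a recent PyTorch:

```python
import torch

# y = x * x saves x for the backward pass, so freeing buffers is observable.
x = torch.ones(2, 2, requires_grad=True)
y = x * x

# First backward: keep the saved tensors alive with retain_graph=True.
y.backward(torch.ones(2, 2), retain_graph=True)

# Second backward through the same graph succeeds only because the
# first call retained it; gradients accumulate into x.grad (2 + 2 = 4).
y.backward(torch.ones(2, 2))
print(x.grad)

# Without retaining, the second backward through the same graph fails.
x2 = torch.ones(2, 2, requires_grad=True)
y2 = x2 * x2
y2.backward(torch.ones(2, 2))  # default: saved tensors are freed
second_backward_failed = False
try:
    y2.backward(torch.ones(2, 2))
except RuntimeError:
    second_backward_failed = True
print("second backward failed:", second_backward_failed)
```

So (assuming the rename is the only change) the question's two runs agree simply because no freed buffer is ever touched again.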