How to copy a Variable in a network graph

The clone operation makes a copy of the Tensor contained in the Variable, but the copy keeps its history: xx will be a new Variable whose computation graph links back to x, so it is not an independent leaf.
When you perform the backward pass, gradients are only accumulated in the Variables that you created yourself (we call them leaf Variables) and for which you set requires_grad=True:

import torch
from torch.autograd import Variable

def basic_fun(x):
    return 3*(x*x)

def get_grad(inp, grad_var):
    # Forward pass on `inp`, backward pass, then read the gradient stored
    # on `grad_var`. Note that .grad accumulates across successive calls.
    A = basic_fun(inp)
    A.backward()
    return grad_var.grad

x = Variable(torch.FloatTensor([1]), requires_grad=True)
xx = x.clone()

# Grad wrt x works: x is a leaf Variable
print(x.creator is None) # is it a leaf? Yes (.creator was renamed .grad_fn in later versions)
print(get_grad(x, x))
print(get_grad(xx, x))   # the gradient flows back through the clone and accumulates on x

# Grad wrt xx won't work: xx is not a leaf, so no gradient is accumulated on it
print(xx.creator is None) # is it a leaf? No, it was produced by clone()
print(get_grad(xx, xx))
print(get_grad(x, xx))
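
For reference, in later PyTorch versions Variable was merged into Tensor, .creator became .grad_fn, and tensors expose .is_leaf and .retain_grad(). Here is a minimal sketch of the same experiment with that newer API (the exact grad_fn name printed may differ between versions):

import torch

x = torch.tensor([1.0], requires_grad=True)   # leaf tensor created by the user
xx = x.clone()                                 # copy that keeps the autograd history

print(x.is_leaf, x.grad_fn)    # True  None
print(xx.is_leaf, xx.grad_fn)  # False  a CloneBackward node

xx.retain_grad()               # ask autograd to also keep the gradient on the non-leaf copy

y = 3 * xx * xx
y.backward()

print(x.grad)   # tensor([6.]) -- the gradient flows back through the clone to the leaf
print(xx.grad)  # tensor([6.]) -- only available because of retain_grad()

If what you actually want is an independent copy that shares no history with x, detach it first, e.g. x.detach().clone() with the tensor API (or Variable(x.data.clone(), requires_grad=True) with the old one); the result is a new leaf that accumulates its own gradients.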