Grad is always None for leaf variable

I am trying to look at the activations of a filter in a CNN.
However, when I look at the gradient of my img_var variable after calling loss.backward(), it is always None, even though img_var is a leaf node.

import numpy as np
import torch
from torchvision.models import vgg16

DEVICE = torch.device('cuda' if torch.cuda.is_available() else 'cpu')

class SaveFeatures():
    def __init__(self, module):
        self.hook = module.register_forward_hook(self.hook_fn)
    def hook_fn(self, module, input, output):
        self.features = torch.tensor(output, requires_grad=True).cuda()
    def close(self):
        self.hook.remove()

class FilterVisualizer():
    def __init__(self, size=224):
        self.size = size
        self.model = vgg16(pretrained=True).to(DEVICE)
        self.model.eval()

        for p in self.model.parameters(): p.requires_grad=False

    def visualize(self, layer, filter_n, lr=0.1, opt_steps=30):
        sz = self.size
        img = np.uint8(np.random.uniform(100, 200, (3, sz, sz)))/255  # generate random image

        activations = SaveFeatures(self.model.features[28])  # register hook

        img_var = torch.autograd.Variable(torch.Tensor(img).to(DEVICE).unsqueeze(0), requires_grad=True)  

        print(img_var.cpu())

        optimizer = torch.optim.Adam([img_var], lr=lr, weight_decay=1e-6)


        for n in range(opt_steps):  # optimize pixel values for opt_steps times
            optimizer.zero_grad()
            self.model(img_var)
            loss = -activations.features[0, filter_n].mean()
            loss.backward()

            print(img_var.grad)  ### This is always None
            optimizer.step()

        img = img_var.detach().cpu().numpy()[0].transpose(1,2,0)
        activations.close()


Hi,

You don’t need to use Variable anymore. You can call .requires_grad_() on your Tensor after building it.
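For example, a minimal sketch of what that could look like here (reusing img and DEVICE from your code):

img_var = torch.tensor(img, dtype=torch.float32, device=DEVICE).unsqueeze(0).requires_grad_()

Since nothing upstream requires grad, the unsqueezed tensor is still a leaf, so its .grad field will be populated by backward().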

The problem here, I think, is your SaveFeatures: you rewrap the output in a brand new Tensor, which breaks the link with the existing graph.
Why not do self.features = output instead?
Also, doing this is not recommended in general, as the saved state can be tricky to handle properly.
You could maybe override the forward function of the vgg16 net to remove what you don’t want?
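For reference, a minimal sketch of the hook with that one-line change, keeping the output attached to the graph:

class SaveFeatures():
    def __init__(self, module):
        self.hook = module.register_forward_hook(self.hook_fn)
    def hook_fn(self, module, input, output):
        # Store the module output directly so it stays connected to the autograd graph
        self.features = output
    def close(self):
        self.hook.remove()

With this, loss.backward() can flow from activations.features back to img_var, and img_var.grad is no longer None.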


Thank you, the problem was in fact in SaveFeatures.