Putting Variables into a matrix, gradient returns None

Hi, to illustrate the problem clearly, here is the simplest case that reproduces it.

import torch
from torch.autograd import Variable

n_comps = 3
s1 = Variable(torch.ones(n_comps, 1, 1), requires_grad=True)
s2 = Variable(torch.ones(n_comps, 1, 1), requires_grad=True)

def scalar_tensor(s1, s2, n_comps):
    # fill the diagonal of each 2x2 block with s1 and s2 via in-place assignment
    scalarMatrix = Variable(torch.zeros(n_comps, 2, 2), requires_grad=False)
    scalarMatrix[:, 0, 0] = s1
    scalarMatrix[:, 1, 1] = s2
    return scalarMatrix

scalarMatrixGT = Variable(torch.zeros(n_comps, 2, 2), requires_grad=False)
scalarMatrixEstimated = scalar_tensor(s1, s2, n_comps)
loss = (scalarMatrixEstimated - scalarMatrixGT).pow(2).sum()
loss.backward()
print(s1.grad)
print(s2.grad)

Output: None and None
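
One check that might help narrow it down (a small diagnostic sketch; I am assuming requires_grad and grad_fn are the right attributes to inspect here) is whether the returned matrix is attached to the autograd graph at all:

out = scalar_tensor(s1, s2, n_comps)
print(out.requires_grad)  # True would mean the in-place assignments were recorded
print(out.grad_fn)        # a backward function (rather than None) would mean the graph is connected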

I am simply trying to put the Variables inside a matrix and do some basic operations on it, but I am not sure why the gradients come back as None.
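
For comparison, here is a sketch of an out-of-place construction using torch.cat (my assumption is that avoiding in-place assignment keeps the graph connected); this is the behaviour I would expect from the in-place version too:

def scalar_tensor_cat(s1, s2, n_comps):
    # build the same (n_comps, 2, 2) diagonal matrices without in-place assignment
    zeros = Variable(torch.zeros(n_comps, 1, 1))
    row0 = torch.cat([s1, zeros], 2)   # (n_comps, 1, 2): [s1, 0]
    row1 = torch.cat([zeros, s2], 2)   # (n_comps, 1, 2): [0, s2]
    return torch.cat([row0, row1], 1)  # (n_comps, 2, 2)

loss2 = (scalar_tensor_cat(s1, s2, n_comps) - scalarMatrixGT).pow(2).sum()
loss2.backward()
print(s1.grad)  # here I would expect a (n_comps, 1, 1) gradient instead of None
print(s2.grad)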

Regards,
Yuhang
