Computing matrix derivatives using autograd

I am running into a few issues while trying to compute matrix derivatives. Here is a minimal example that reproduces the error.

import numpy as np
import torch

theta = torch.tensor(np.random.uniform(low=-np.pi, high=np.pi), requires_grad=True)
# Rotation-like 2x3 matrix built from theta's trig values
rot_mat = torch.tensor([[torch.cos(theta), torch.sin(theta), 0],
                        [-torch.sin(theta), torch.cos(theta), 0]],
                        dtype=torch.float, requires_grad=True)
torch.autograd.grad(outputs=rot_mat,
                    inputs=theta, grad_outputs=torch.ones_like(rot_mat),
                    create_graph=True, retain_graph=True)

The above code results in the following error: "One of the differentiated Tensors appears to not have been used in the graph. Set allow_unused=True if this is the desired behavior."

I tried passing allow_unused=True, but then the gradient is returned as None. I am not sure what is causing the graph to be disconnected here. Any help is appreciated.

torch.tensor() does not backpropagate through its inputs: it copies the values into a new leaf tensor, so rot_mat is not connected to theta in the autograd graph. Build the matrix with torch.cat/torch.stack instead.
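
A minimal sketch of that fix, assuming the same 2x3 matrix as in the question: stacking tensors that are computed from theta keeps rot_mat in the autograd graph, so the gradient call returns a value instead of None.

import numpy as np
import torch

theta = torch.tensor(np.random.uniform(low=-np.pi, high=np.pi), requires_grad=True)

zero = torch.zeros_like(theta)
# Each row is stacked from tensors that depend on theta, so rot_mat stays connected to it
rot_mat = torch.stack([
    torch.stack([torch.cos(theta), torch.sin(theta), zero]),
    torch.stack([-torch.sin(theta), torch.cos(theta), zero]),
])  # shape (2, 3)

grads = torch.autograd.grad(outputs=rot_mat, inputs=theta,
                            grad_outputs=torch.ones_like(rot_mat),
                            create_graph=True)
print(grads)  # non-None gradient with respect to theta

Note that here rot_mat inherits theta's dtype (float64 from NumPy); cast theta with .float() if a float32 matrix is needed.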
