Gradient is lost when using torch.tensor([t1, t2])

y1 is fine, but y2 loses the gradient of a:

import torch
a = torch.tensor(0.0)
b = torch.tensor(0.0)

a.requires_grad = True
b.requires_grad = True

cosa = torch.cos(a)
cosb = torch.cos(b)

y1 = cosa + cosb
# torch.tensor() copies the values into a new leaf tensor with no grad_fn
y2 = torch.tensor([
    cosa, cosb
])

y2.sum().backward()  # RuntimeError: element 0 of tensors does not require grad and does not have a grad_fn

print("a has grad ", a.requires_grad)
print("a grad", a.grad)

Update:
Besides, I need y2 to be a matrix to support the following operations.

y2 = torch.tensor([
    cosa, cosb
])

mat = torch.tensor([
    [1.0, 2.0],
    [3.0, 4.0]
])

y2 = torch.matmul(y2, mat)
y2.sum().backward()  # still raises the same RuntimeError, because y2 was built with torch.tensor() and has no grad_fn
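Assuming the goal is just a differentiable length-2 vector that can be multiplied by a matrix, the same torch.stack approach seems to cover this case too; matmul of a 1-D tensor with a 2-D matrix is a vector-matrix product, so y2 does not need to be 2-D. A sketch:

import torch

a = torch.tensor(0.0, requires_grad=True)
b = torch.tensor(0.0, requires_grad=True)

# differentiable vector built from the graph nodes
y2 = torch.stack([torch.cos(a), torch.cos(b)])  # shape (2,)

mat = torch.tensor([
    [1.0, 2.0],
    [3.0, 4.0]
])

out = torch.matmul(y2, mat)  # vector-matrix product, shape (2,)
out.sum().backward()

print("a grad", a.grad)  # zero here because -sin(0) = 0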

Maybe something like this is better:

import torch
a = torch.tensor([0.0], requires_grad=True)
b = torch.tensor([0.0], requires_grad=True)

cosa = torch.cos(a)
cosb = torch.cos(b)

y1 = cosa + cosb
# torch.cat keeps the result in the autograd graph, so gradients flow back to a and b
y2 = torch.cat([
    cosa, cosb
], dim=0)

y2.sum().backward() 

print("a has grad ", a.requires_grad)
print("a grad", a.grad)