"requires_grad" after list addition

In the code below, say I have losses l1 and l2 with requires_grad=True, stored as a list in the variable comb_L. After adding them with torch.tensor(comb_L).sum(0), the result has no grad_fn and requires_grad is False. I want to add the losses so that Total_L has requires_grad=True and will backpropagate in the backward pass.

import torch

l1 = torch.ones([1], requires_grad=True)
print("l1 is", l1)

l2 = torch.zeros([1], requires_grad=True)
print("l2 is", l2)

comb_L = [l1, l2]
print("comb_L is", comb_L)

# torch.tensor() copies the values into a new leaf tensor,
# detaching them from the graph, so grad_fn is None here
Total_L = torch.tensor(comb_L).sum(0)
print("Total loss is", Total_L, Total_L.grad_fn)

You'll want to replace torch.tensor(comb_L) with torch.stack(comb_L). torch.tensor() copies the data into a new leaf tensor and detaches it from the autograd graph, whereas torch.stack() is a differentiable operation, so the summed result will have a grad_fn and gradients will flow back to l1 and l2.

import torch

l1 = torch.ones([1], requires_grad=True)
print("l1 is", l1)

l2 = torch.zeros([1], requires_grad=True)
print("l2 is", l2)

comb_L = [l1, l2]
print("comb_L is", comb_L)

# torch.stack() keeps the tensors in the autograd graph,
# so Total_L now has a grad_fn
Total_L = torch.stack(comb_L).sum(0)
print("Total loss is", Total_L, Total_L.grad_fn)
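As a quick sanity check (not from the original thread, just a sketch), you can call backward() on the stacked sum and confirm that gradients reach the leaf tensors; since the total is just l1 + l2, each gradient is 1:

```python
import torch

l1 = torch.ones([1], requires_grad=True)
l2 = torch.zeros([1], requires_grad=True)

# stack keeps l1 and l2 in the graph; sum(0) reduces to a single element
Total_L = torch.stack([l1, l2]).sum(0)
Total_L.backward()  # allowed: Total_L has exactly one element

print(l1.grad)  # tensor([1.])
print(l2.grad)  # tensor([1.])
```

With the original torch.tensor(comb_L) version, l1.grad and l2.grad would stay None because the copy is detached from the graph.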