Hey!

I recently started using PyTorch, and right now I'm facing a problem with computing gradients.

I have a list of tensors `L` with `requires_grad = True`, which I stack into a single tensor: `T = torch.stack(L, dim=0)`.

When I try to compute the gradients for all elements of `T` with `out.backward()`, `T[0].grad` is always `None`.
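Here is a minimal repro of what I mean (the shapes, values, and the `sum()` loss are just illustrative):

```python
import torch

# A list of leaf tensors that should receive gradients
L = [torch.randn(3, requires_grad=True) for _ in range(4)]

# Stacking creates a NEW tensor; T is a non-leaf node in the autograd graph
T = torch.stack(L, dim=0)

out = T.sum()   # some scalar computed from T
out.backward()

print(T[0].grad)  # -> None, even though requires_grad was True on every element of L
```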

How can I get the gradients in this case?