Hi everyone. I created a list S containing 4 matrices and I need to calculate the mean and the standard deviation of the columns of the first element of S, but I get:

RuntimeError: Can’t call numpy() on Tensor that requires grad. Use tensor.detach().numpy() instead.
I also tried to convert the list elements with np.array or np.asarray, but that doesn't work either. How can I solve this?

Hi,
Seems like you are trying to convert a tensor that requires gradient directly into a numpy array, like so:

import torch
x = torch.tensor([2.0, 3.0], requires_grad=True)
x_num = x.numpy() # RuntimeError: Can't call numpy() on Tensor that requires grad. Use tensor.detach().numpy() instead.
print(x_num)

This error is expected: converting the tensor to a NumPy array breaks the computation graph, so no gradients could be computed through it afterwards.

You can explicitly detach the tensor from the graph and then convert it to a numpy array like so:
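Continuing the small example from above:

```python
import torch

x = torch.tensor([2.0, 3.0], requires_grad=True)

# detach() returns a view of the tensor that is cut off from the
# autograd graph, so the numpy() conversion is now allowed
x_num = x.detach().numpy()
print(x_num)  # [2. 3.]
```

Note that x itself still requires gradients; detach() does not modify it in place.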

If you still need gradients for these tensors later on, compute the statistics with torch operations, e.g. torch.mean(S[0], dim=0) and torch.std(S[0], dim=0) for per-column values (dim=1 would give per-row values instead). If you just want to store these values, first call .detach() to remove the tensor from the autograd graph, then .numpy() to convert it to a NumPy array, and then use your code to get the mean and std.
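A sketch of both options, assuming S is a list of 4 gradient-tracking matrices (the 5x3 shape here is hypothetical):

```python
import torch

# hypothetical stand-in for the poster's list of 4 matrices
S = [torch.randn(5, 3, requires_grad=True) for _ in range(4)]

# Option 1: stay in PyTorch, so gradients can still flow through the result
col_mean = torch.mean(S[0], dim=0)  # per-column mean (reduces over rows)
col_std = torch.std(S[0], dim=0)    # per-column standard deviation

# Option 2: detach first, then work in NumPy (no gradients afterwards)
arr = S[0].detach().numpy()
np_mean = arr.mean(axis=0)
np_std = arr.std(axis=0, ddof=1)  # ddof=1 matches torch.std's unbiased default
```

Note the ddof=1 in the NumPy call: torch.std applies Bessel's correction by default, while numpy's std does not, so the two options disagree unless you align them.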