I am trying to do this:
my_file3 = x_mean[1].detach().cpu().numpy()
and it shows me this error:
AttributeError: 'float' object has no attribute 'detach'
Is the variable part of a graph? If it is not part of a graph or does not have a gradient, you won't be able to detach it from anything, because it wasn't attached in the first place.
Adding on to what @Dwight_Foster said, you can do
my_file3 = (x_mean[1].detach() if x_mean[1].requires_grad else x_mean[1]).cpu().numpy()
to make it work in either case.
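To illustrate the one-liner above, here is a minimal sketch with a made-up tensor (the real x_mean from the thread isn't shown) — note this only works when x_mean[1] is actually a torch.Tensor:

```python
import torch

# Hypothetical stand-in: the actual x_mean from the question is not shown
x_mean = torch.ones(5)

t = x_mean[1]
# Detach only if the element carries gradient history, then move to CPU
my_file3 = (t.detach() if t.requires_grad else t).cpu().numpy()
print(type(my_file3))  # <class 'numpy.ndarray'> (0-dimensional)
```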
Hi Himanshu,
I tried what you have suggested, but it is showing me this now:
AttributeError: 'float' object has no attribute 'requires_grad'
Are you sure that x_mean[1] is a torch.Tensor object? Even if you index a single element out of a tensor, the result will still have .requires_grad as False by default, since it is still of the torch.Tensor class:
>>> import torch
>>> x_mean = torch.ones((50))
>>> x_mean.requires_grad
False
>>> x_mean[1].requires_grad
False
>>> type(x_mean[1])
<class 'torch.Tensor'>
Maybe you are appending the .item() of the original tensor somewhere, which changes the class to a normal Python float:
>>> x_mean=[1.,2.,3.,4.,5.]
>>> type(x_mean[1])
<class 'float'>
>>> x_mean[1].requires_grad
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
AttributeError: 'float' object has no attribute 'requires_grad'
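If that is the case — x_mean is sometimes a plain Python list of floats and sometimes a tensor — a small helper (hypothetical, not from this thread) can handle both without the AttributeError:

```python
import numpy as np
import torch

def to_numpy(x):
    """Convert a tensor, or a plain Python number/list, to NumPy."""
    if isinstance(x, torch.Tensor):
        # Tensors need detaching (if on a graph) and moving to CPU first
        return x.detach().cpu().numpy()
    # Plain floats and lists convert directly; no detach needed
    return np.asarray(x)

print(to_numpy([1., 2., 3.][1]))   # 2.0
print(to_numpy(torch.ones(3)[1]))  # 1.0
```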