AttributeError: 'float' object has no attribute 'device'

Hi, I’m running into this error after 9 epochs of multi-GPU training with DataParallel. Any idea what’s going on?


  output = self.net(data)                                                                                                                                                                                
  File "/usr/local/lib/python3.8/dist-packages/torch/nn/modules/module.py", line 1051, in _call_impl                                                                                                       
    return forward_call(*input, **kwargs)                                                                                                                                                                  
  File "/usr/local/lib/python3.8/dist-packages/torch/nn/parallel/data_parallel.py", line 169, in forward                                                                                                   
    return self.gather(outputs, self.output_device)                                                                                                                                                        
  File "/usr/local/lib/python3.8/dist-packages/torch/nn/parallel/data_parallel.py", line 181, in gather                                                                                                    
    return gather(outputs, output_device, dim=self.dim)                                                                                                                                                    
  File "/usr/local/lib/python3.8/dist-packages/torch/nn/parallel/scatter_gather.py", line 78, in gather                                                                                                    
    res = gather_map(outputs)                                                                                                                                                                              
  File "/usr/local/lib/python3.8/dist-packages/torch/nn/parallel/scatter_gather.py", line 73, in gather_map                                                                                                
    return type(out)(map(gather_map, zip(*outputs)))                                                                                                                                                       
  File "/usr/local/lib/python3.8/dist-packages/torch/nn/parallel/scatter_gather.py", line 69, in gather_map                                                                                                
    return type(out)(((k, gather_map([d[k] for d in outputs]))                                                                                                                                             
  File "/usr/local/lib/python3.8/dist-packages/torch/nn/parallel/scatter_gather.py", line 69, in <genexpr>                                                                                                 
    return type(out)(((k, gather_map([d[k] for d in outputs]))                                                                                                                                             
  File "/usr/local/lib/python3.8/dist-packages/torch/nn/parallel/scatter_gather.py", line 69, in gather_map                                                                                                
    return type(out)(((k, gather_map([d[k] for d in outputs]))                                                                                                                                             
  File "/usr/local/lib/python3.8/dist-packages/torch/nn/parallel/scatter_gather.py", line 69, in <genexpr>                                                                                                 
    return type(out)(((k, gather_map([d[k] for d in outputs]))                                                                                                                                             
  File "/usr/local/lib/python3.8/dist-packages/torch/nn/parallel/scatter_gather.py", line 63, in gather_map                                                                                                
    return Gather.apply(target_device, dim, *outputs)                                                                                                                                                      
  File "/usr/local/lib/python3.8/dist-packages/torch/nn/parallel/_functions.py", line 56, in forward                                                                                                       
    assert all(i.device.type != 'cpu' for i in inputs), (                                                                                                                                                  
  File "/usr/local/lib/python3.8/dist-packages/torch/nn/parallel/_functions.py", line 56, in <genexpr>                                                                                                     
    assert all(i.device.type != 'cpu' for i in inputs), (                                                                                                                                                  
AttributeError: 'float' object has no attribute 'device'

Hi @ash_gamma,

Can you check the type of data via type(data)? Is it float?

Your data variable needs to be of type torch.Tensor for it to have the .device attribute. You can convert it via torch.tensor(data), torch.from_numpy(data), etc.
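As a minimal sketch of that conversion (the array here is just a made-up example input):

```python
import numpy as np
import torch

# A NumPy array, e.g. loaded from disk
data = np.random.rand(4, 3).astype(np.float32)
tensor_data = torch.from_numpy(data)  # shares memory with the array

# A plain Python scalar also works via torch.tensor
scalar = 0.5
tensor_scalar = torch.tensor(scalar)

# Both now expose .device (cpu until you move them with .to('cuda'))
print(tensor_data.device, tensor_scalar.device)
```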

Hi, the data is of type torch.Tensor, since the train/val cycle runs fine for 8 epochs.

Could you please print each `i` in `inputs`?
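Note that the traceback fails while gathering the model's *outputs*, not the input data, so one of the values your forward returns may have become a plain float on some replica. A quick way to locate it is to walk the output the same way gather_map does (recursing into dicts and sequences). This is just a debugging sketch; `out` below is a hypothetical output dict, and `find_non_tensors` is a helper I made up for illustration:

```python
import torch

def find_non_tensors(obj, path="output"):
    """Recursively collect (path, value) pairs for every leaf
    that is not a torch.Tensor, mirroring how DataParallel's
    gather_map walks nested outputs."""
    found = []
    if isinstance(obj, torch.Tensor):
        return found
    if isinstance(obj, dict):
        for k, v in obj.items():
            found += find_non_tensors(v, f"{path}[{k!r}]")
    elif isinstance(obj, (list, tuple)):
        for i, v in enumerate(obj):
            found += find_non_tensors(v, f"{path}[{i}]")
    else:
        found.append((path, obj))
    return found

# Hypothetical model output where one value has degraded to a float
out = {"logits": torch.zeros(2, 3), "loss": 0.5}
print(find_non_tensors(out))  # flags output['loss']
```

Running this on the output of a single forward pass (before gather) should show exactly which entry is a float instead of a tensor.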