How to move this to cpu or detach?

I want to move certain values off the GPU. How do I do it?
The following code is not working.

loss = criterion(outputs, labels)
acc = binary_acc(outputs, labels)                  
epoch_val_loss += loss.item().detach()
epoch_val_acc += acc.item().detach()

It throws the following error:

AttributeError: 'float' object has no attribute 'detach'

I also tried this, but it does not work either:

epoch_val_loss += loss.item().cpu()
epoch_val_acc += acc.item().cpu()

This throws the error:

AttributeError: 'float' object has no attribute 'cpu'

Please help!

tensor.item() returns a Python scalar, which is already detached and on the CPU, so calling .detach() or .cpu() on the result fails.
Note that .item() itself will fail if the tensor contains more than a single value and thus cannot be represented by a Python scalar.
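A minimal sketch of the fix, using placeholder tensors in place of the criterion and binary_acc outputs: simply drop the extra .detach()/.cpu() calls.

```python
import torch

# hypothetical 0-dim tensors, standing in for criterion(...) and binary_acc(...)
loss = torch.tensor(0.25)
acc = torch.tensor(92.0)

epoch_val_loss = 0.0
epoch_val_acc = 0.0

# .item() already returns a plain Python float (detached and on the CPU),
# so no further .detach() or .cpu() call is needed (or possible) afterwards
epoch_val_loss += loss.item()
epoch_val_acc += acc.item()
```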


I have another doubt… is the output of a neural network stored on the CPU or the GPU? What should I do to get it off the GPU if it's on the GPU?

It depends on whether the parameters and the input were previously pushed to the GPU.
If so, you can push the tensor back to the host via tensor.cpu() or'cpu').


The reason why I want to move these to the CPU is the limited memory on the GPU. I have two more doubts:

  1. Will using torch.cuda.empty_cache() at the end of every epoch during training help with that?
  2. Will using torch.cuda.empty_cache() at the end of every epoch affect the way the loss is backpropagated?
  1. No, as it will only free the cache and thus force PyTorch to re-allocate the memory via synchronizing malloc calls, which will slow down your code. You won’t avoid out-of-memory issues, but would allow other applications to use the freed GPU memory.

  2. No, freeing the cache will not have any functionality impact (besides slowing down your code).

Moving the output to the CPU will also not move the entire computation graph to it, but only this single value (which won’t save a lot of memory).
If you are running out of memory, you could decrease the batch size, use torch.utils.checkpoint, the new CPU-offloading util., mixed-precision training, etc.
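Of these options, mixed-precision training is often the easiest to try. A minimal sketch using torch.cuda.amp (the model, data, and hyperparameters below are illustrative; autocast and GradScaler degrade gracefully to a no-op on a CPU-only machine):

```python
import torch
from torch import nn

# illustrative toy model and data
model = nn.Linear(10, 1)
criterion = nn.MSELoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
use_cuda = torch.cuda.is_available()
scaler = torch.cuda.amp.GradScaler(enabled=use_cuda)

inputs = torch.randn(8, 10)
targets = torch.randn(8, 1)

optimizer.zero_grad()
# run the forward pass in reduced precision where safe
with torch.cuda.amp.autocast(enabled=use_cuda):
    loss = criterion(model(inputs), targets)

# scale the loss, backprop, and step; with enabled=False these are plain calls
scaler.scale(loss).backward()
scaler.step(optimizer)
scaler.update()
```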
