PyTorch CUDA: Free GPU Memory

del model
torch.cuda.empty_cache()
print("After Clearing Model: ", torch.cuda.memory_allocated(device))

I have loaded a model onto the GPU (model.to(device)) and am then trying to delete it to free up some memory. However, the allocated memory does not go down. Am I missing something?


You might have some references stored to this model, e.g. its output.
Make sure to delete all tensors etc. which might depend on the model instance.

Hey,
Sometimes you also need to kill the processes that are still holding GPU memory.
For that, do the following:

  1. Run nvidia-smi
  2. In the lower table you will see the processes running on your GPUs
  3. Check their PID
  4. Kill those processes with kill PID_Number

Hope it helps