How to clear GPU memory after using a model?

I’m trying to free up GPU memory after I’ve finished using the model.

  • I checked nvidia-smi before creating and training the model: 402MiB / 7973MiB
  • After creating and training the model, I checked the GPU memory status again with nvidia-smi: 7801MiB / 7973MiB
  • Now I tried to free up GPU memory with:
import gc
import torch

del model
torch.cuda.empty_cache()
gc.collect()

and checked the GPU memory again: 2361MiB / 7973MiB

  • As you can see, not all of the GPU memory was released.
  • So far I can only release the GPU memory from the terminal (sudo fuser -v /dev/nvidia* and then kill pid).

Is there a way to free up the GPU memory after I’m done using the model?
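
In case it helps, here is a minimal, self-contained sketch of the fuller cleanup I’m experimenting with (the Linear model and SGD optimizer are just placeholders standing in for my real training objects):

import gc
import torch

# Placeholders standing in for my actual model and optimizer.
model = torch.nn.Linear(1024, 1024).cuda()
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

# Drop every Python reference that still points at CUDA tensors
# (model parameters, optimizer state, cached outputs/losses, ...).
del model
del optimizer

# Collect the now-unreferenced tensors first, then ask the caching
# allocator to return the freed blocks to the driver.
gc.collect()
torch.cuda.empty_cache()

# What PyTorch itself still holds; nvidia-smi additionally shows the
# CUDA context, which is only released when the process exits.
print(torch.cuda.memory_allocated())  # bytes used by live tensors
print(torch.cuda.memory_reserved())   # bytes cached by the allocator

My understanding is that the CUDA context itself (a few hundred MiB) can’t be released without ending the process, which would explain why nvidia-smi never drops all the way back to the original 402MiB, but that doesn’t account for the ~2GiB that stays allocated here.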