import torch

# model was previously created and moved to the GPU with model.to(device)
del model
torch.cuda.empty_cache()
print("After clearing model:", torch.cuda.memory_allocated(device))
I have loaded a model onto the GPU (model.to(device)) and am now trying to delete it to free up memory. However, torch.cuda.memory_allocated(device) does not go down after the code above runs. Am I missing something?
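For reference, here is a minimal self-contained version of what I am doing. The small nn.Linear model is just a stand-in for my actual (much larger) model; the load/delete/measure sequence is the same:

```python
import torch
import torch.nn as nn

if torch.cuda.is_available():
    device = torch.device("cuda")

    # Stand-in for my real model.
    model = nn.Linear(1024, 1024).to(device)
    print("After loading model:", torch.cuda.memory_allocated(device))

    del model
    torch.cuda.empty_cache()
    print("After clearing model:", torch.cuda.memory_allocated(device))
```

Note that torch.cuda.memory_allocated reports memory held by live tensors, while empty_cache only releases PyTorch's cached-but-unused blocks back to the driver, so allocated memory should already drop once the last reference to the model's parameters is deleted.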