How to remove all PyTorch allocations from the GPU

I was hoping there is a quick way to remove everything PyTorch has allocated on the GPU?

You could delete all tensors, parameters, models, etc. and call `torch.cuda.empty_cache()` afterwards to remove all allocations created by PyTorch. To also remove the CUDA context, you would have to shut down the Python session.
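As a minimal sketch of that advice (the `model` and `data` names are hypothetical placeholders for whatever objects hold your GPU tensors):

```python
# Sketch: drop all Python references to GPU tensors, then release the
# cached blocks PyTorch holds. Guarded so it also runs without torch/CUDA.
import gc

try:
    import torch
    HAVE_CUDA = torch.cuda.is_available()
except ImportError:          # torch not installed in this environment
    HAVE_CUDA = False


def free_gpu_memory():
    """Collect unreachable tensors, then return cached memory to CUDA.

    Note: empty_cache() only releases memory that is no longer referenced,
    so every tensor/model/optimizer must be deleted (or go out of scope)
    first, e.g. `del model, data` before calling this.
    """
    gc.collect()                     # ensure dead tensors are collected
    if HAVE_CUDA:
        torch.cuda.empty_cache()     # release cached blocks to the driver


free_gpu_memory()
```

Even after this, `nvidia-smi` will still show some memory in use, since the CUDA context itself stays alive until the Python process exits.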

Is there a way to get a list of all tensors that currently exist on the GPU?

No, I don’t think there is a built-in way to list all tensors. You could try this approach, but as discussed in that topic, tensors kept alive in the C++ backend won’t be returned by it.
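The approach referred to is roughly the usual `gc`-based scan: walk Python’s tracked objects and report any that are CUDA tensors. A hedged sketch (it only sees tensors the Python garbage collector knows about):

```python
# Sketch of the gc-based scan: report (type, shape) for every tensor the
# Python runtime tracks that lives on a CUDA device. Tensors held only by
# the C++ backend will NOT appear here.
import gc


def list_gpu_tensors():
    try:
        import torch
    except ImportError:      # torch not installed: nothing to report
        return []
    found = []
    for obj in gc.get_objects():
        try:
            if torch.is_tensor(obj) and obj.is_cuda:
                found.append((type(obj).__name__, tuple(obj.size())))
        except Exception:
            # some tracked objects raise on attribute access; skip them
            pass
    return found


for name, shape in list_gpu_tensors():
    print(name, shape)
```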

I see how to get the list of all tensors. However, I am having a hard time moving the tensors to the CPU and then removing them from the GPU.
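One way to sketch that step, assuming `t` is some CUDA tensor you got from the list above: copy it to host memory, then drop every reference to the GPU copy so the allocation can be released. The function name is made up for illustration.

```python
# Hedged sketch: copy a tensor to the CPU and encourage PyTorch to free
# the GPU copy. The caller must also drop its own reference to the
# original tensor, or the GPU memory cannot actually be released.
import gc


def move_off_gpu(t):
    try:
        import torch
    except ImportError:
        return t
    if not torch.is_tensor(t):
        return t                     # pass non-tensors through unchanged
    t_cpu = t.detach().cpu()         # copy the data to host memory
    del t                            # drop this function's GPU reference
    gc.collect()                     # collect the now-unreferenced tensor
    if torch.cuda.is_available():
        torch.cuda.empty_cache()     # release the freed block from the cache
    return t_cpu
```

Usage would look like `t = move_off_gpu(t)`, rebinding the name so the old GPU-resident tensor loses its last reference.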