Pre-allocate necessary GPU memory

Hi all,
Is there any function to pre-allocate the necessary GPU memory and keep it fixed for the entire training loop?

This does not exist.
But PyTorch uses a caching memory allocator on the GPU: it does not release memory back to the driver right away and reuses it for later allocations, which gives you similar behaviour.
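You can observe the caching behaviour with `torch.cuda.memory_allocated()` (memory held by live tensors) and `torch.cuda.memory_reserved()` (memory the allocator has claimed from the driver). A minimal sketch, assuming a CUDA device is available (the tensor size is illustrative):

```python
import torch

def caching_allocator_demo():
    # Sketch only: requires a CUDA device; the tensor size is illustrative.
    if not torch.cuda.is_available():
        return "no-cuda"
    x = torch.empty(256, 1024, 1024, device="cuda")  # ~1 GiB of float32
    allocated_with_x = torch.cuda.memory_allocated()
    del x  # the tensor is freed...
    allocated_after_del = torch.cuda.memory_allocated()
    reserved_after_del = torch.cuda.memory_reserved()
    # ...but the caching allocator keeps the block reserved for reuse,
    # so nvidia-smi (and other processes) still see it as taken.
    assert allocated_after_del < allocated_with_x
    assert reserved_after_del >= allocated_with_x - allocated_after_del
    return "cuda-demo-ran"

print(caching_allocator_demo())
```

On a machine without a GPU the function just returns `"no-cuda"`; with one, the internal assertions show that `memory_reserved()` stays high after `del` even though `memory_allocated()` drops.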


Hi @albanD,
Suppose my code took 5 GB of GPU memory. I freed some of the variables using `del x` (say around 3 GB worth). So now I still have the 5 GB allocated to my process (2 GB in use and 3 GB free in the cache), and other processes cannot use this 5 GB. Which implies I effectively have 5 GB of memory reserved for my process. Am I correct?

Yes, exactly. That memory will still be available to your process (but not to others).
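If you do want to hand the cached-but-unused memory back to the driver so that other processes can use it, `torch.cuda.empty_cache()` releases the cached blocks (tensors still in use are unaffected). A small sketch, assuming a CUDA device:

```python
import torch

def release_cached_memory():
    # Sketch only: requires a CUDA device; the tensor size is illustrative.
    if not torch.cuda.is_available():
        return "no-cuda"
    x = torch.empty(64, 1024, 1024, device="cuda")  # ~256 MiB of float32
    del x  # freed, but kept in the allocator's cache
    torch.cuda.empty_cache()  # return unused cached blocks to the driver
    # reserved memory drops back below the freed tensor's size
    assert torch.cuda.memory_reserved() < 64 * 1024 * 1024 * 4
    return "released"

print(release_cached_memory())
```

Note that `empty_cache()` only releases blocks with no live tensors in them, so calling it mid-training usually just costs you re-allocation time without reducing peak usage.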
