Memory Consumption

How can I check the GPU memory consumption of a model on a given input, and also the size of the model? torchsummary does not work properly in my case.

Hi,

Unfortunately, this is very hard to predict upfront.
If you want an exact value for the max memory consumption (including autograd buffers), the best way is to move your model to the GPU and run a forward and backward pass.
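For reference, a minimal sketch of that measurement, assuming a CUDA device and using `torch.cuda.max_memory_allocated` to read the peak (the model, input shape, and loss here are placeholders):

```python
import torch
import torch.nn as nn

device = torch.device("cuda")

# Placeholder model and input; substitute your own.
model = nn.Sequential(nn.Linear(1024, 1024), nn.ReLU(), nn.Linear(1024, 10)).to(device)
x = torch.randn(64, 1024, device=device)

torch.cuda.reset_peak_memory_stats(device)

# Forward + backward so autograd buffers are included in the peak.
out = model(x)
loss = out.sum()
loss.backward()

torch.cuda.synchronize(device)
print(f"Peak GPU memory: {torch.cuda.max_memory_allocated(device) / 1024**2:.1f} MiB")
```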

Hello, thanks for the quick answer! In this situation, I only need the GPU memory consumption of the forward pass. Also, is there a way to know the total size of the model (using special methods, or from the number of parameters)?

Unfortunately, modules can save arbitrary elements during the forward pass, so it is really hard to have a common API that computes this.
If torchsummary does not work, you will most likely have to run a forward pass and check the actual consumption.
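For the model-size part of the question, one common approach (a sketch, not from the answer above) is to sum the bytes of all parameters and buffers; note this counts only stored tensors, not activations or autograd state:

```python
import torch.nn as nn

def model_size_mib(model: nn.Module) -> float:
    """Sum the bytes of all parameters and buffers; activations are not counted."""
    param_bytes = sum(p.numel() * p.element_size() for p in model.parameters())
    buffer_bytes = sum(b.numel() * b.element_size() for b in model.buffers())
    return (param_bytes + buffer_bytes) / 1024**2

model = nn.Linear(1024, 1024)  # placeholder model
print(f"Model size: {model_size_mib(model):.2f} MiB")  # ~4 MiB in float32
```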

Thanks! Are there any tutorials on how to manage this correctly (run a forward pass and check the actual consumption)?

If you are looking for CPU memory, you can use regular Python tools to check this.
For example, this memory profiler gives very nice output: https://pypi.org/project/memory-profiler/
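A minimal usage sketch, assuming memory-profiler is installed (`pip install memory-profiler`); the `@profile` decorator prints line-by-line CPU memory usage when the script runs:

```python
from memory_profiler import profile

import torch
import torch.nn as nn

@profile
def run_forward():
    model = nn.Linear(1024, 1024)   # placeholder model
    x = torch.randn(64, 1024)       # placeholder input
    return model(x)

if __name__ == "__main__":
    run_forward()
```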

You can monitor the status of your GPU via nvidia-smi.
E.g., you can create a simple script that only moves your model to the GPU and then waits. Monitoring the GPU usage of such a script may give you an idea.
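A minimal sketch of such a script (placeholder model; while it sleeps, run `nvidia-smi` or `watch -n 1 nvidia-smi` in another terminal):

```python
import time

import torch
import torch.nn as nn

device = torch.device("cuda")
model = nn.Linear(1024, 1024).to(device)  # placeholder model

# Keep the process alive so the GPU usage can be inspected meanwhile.
print("Model is on the GPU; check nvidia-smi now.")
time.sleep(60)
```

Note that nvidia-smi reports the whole process, including the CUDA context and PyTorch's caching allocator, so the number will be somewhat larger than the model's tensors alone.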
