Hi everyone,
Do you know a way to estimate the GPU memory needed for training or testing a model, depending on the model itself, the input size, and the batch size?
Thank you.
Hi Pirate-Roberts,
You may find this repository useful. link
That gives a theoretical estimate. Actual memory usage depends heavily on the versions of CUDA, cuDNN, and PyTorch being used.
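For a rough back-of-envelope estimate, you can also reason it out by hand: in fp32 training with Adam, the model itself typically costs about 4x the parameter bytes (weights + gradients + two optimizer states), while activations scale linearly with batch size. The sketch below is only an illustration of that arithmetic, not an exact measurement; the function name and the per-sample activation figure are made up for the example, and real usage will differ due to allocator caching, workspace buffers, and framework overhead.

```python
def estimate_training_memory_bytes(num_params, batch_size,
                                   activation_bytes_per_sample,
                                   bytes_per_param=4, optimizer_states=2):
    """Rough fp32 training-memory estimate (illustrative, not exact)."""
    # Weights + gradients + optimizer states (Adam keeps ~2 extra copies)
    model_mem = num_params * bytes_per_param * (2 + optimizer_states)
    # Activations saved for backprop scale linearly with batch size
    activation_mem = batch_size * activation_bytes_per_sample
    return model_mem + activation_mem

# Example: 25M-parameter model, batch size 32,
# assuming ~50 MiB of activations per sample (a made-up figure)
est = estimate_training_memory_bytes(25_000_000, 32, 50 * 1024**2)
print(f"{est / 1024**3:.1f} GiB")  # ~1.9 GiB before framework overhead
```

For inference you can drop the gradient and optimizer terms, which is why evaluation usually fits in far less memory than training.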
I’ll check this, thank you!