Model Memory Requirement Calculation for Inference?

Hi, I need some help/guidance on how to calculate the amount of memory required when performing a model inference.
As I understand it, memory is consumed by the input image (uint8/float), the model weights (FP32), and the activations generated during the forward pass. (Please feel free to correct me if I am wrong here.)
But how do I calculate this in PyTorch for a specific model, say ResNet18?