Getting memory at inference


I am trying to figure out how much RAM is used while running a trained model on a single input like:

model = ...   # model definition + loading trained weights
x = image     # load input image
y = model(x)  # run model on image

Basically, I want to get the peak memory usage of this script.

On Linux, I'd probably use /usr/bin/time -v (you need the full path so you get the external binary rather than the shell builtin). This reports the line "Maximum resident set size (kbytes):".
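For comparison, the same peak-RSS figure that time -v reports can also be read from inside the script with the standard-library resource module (a minimal sketch; note this only covers host/CPU memory, not GPU memory):

```python
import resource

# Peak resident set size of this process so far.
# On Linux, ru_maxrss is in kilobytes; on macOS it is in bytes.
peak_kb = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
print(f"Max RSS: {peak_kb} kB")
```

Calling this at the end of the script (after the forward pass) gives the high-water mark for the whole run, which should match what /usr/bin/time -v prints.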

Is this approach recommended over using PyTorch functions like those described here?

Specifically, I am trying to find out how much memory this program uses on the CPU and on the GPU.
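For the GPU side, PyTorch's built-in counters can be queried directly around the forward pass. A minimal sketch, where the Linear layer and random tensor are placeholders standing in for the real trained model and image:

```python
import torch

model = torch.nn.Linear(16, 4)  # placeholder for the trained model
x = torch.randn(1, 16)          # placeholder for the input image

if torch.cuda.is_available():
    model = model.cuda()
    x = x.cuda()
    # Clear the peak-memory counter so it reflects only the inference below.
    torch.cuda.reset_peak_memory_stats()
    with torch.no_grad():
        y = model(x)
    torch.cuda.synchronize()
    print(f"Peak GPU memory allocated: {torch.cuda.max_memory_allocated()} bytes")
else:
    with torch.no_grad():
        y = model(x)
```

Note that max_memory_allocated() only counts tensors allocated through PyTorch's caching allocator; the total reserved by the process (visible in nvidia-smi) is reported separately by torch.cuda.max_memory_reserved(), so the two numbers will differ.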