What is Forward/backward pass size?

ccz

I use PyTorch's summary library (torchsummary) to summarize the size of my deep learning model.

I have seen models with a small number of parameters but a large forward pass size. The forward pass size also seems to be related to the speed of the model.
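For example, here is a toy case of what I mean (just a sketch with a made-up model, assuming the torchsummary package): a couple of small conv layers have only a few hundred parameters, but on a large input the summary reports a large forward/backward pass size.

```python
import torch.nn as nn
from torchsummary import summary

# Tiny conv layers: only a few hundred weights in total, but on a
# 512 x 512 input each layer emits a 16 x 512 x 512 feature map.
model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.Conv2d(16, 16, kernel_size=3, padding=1),
    nn.ReLU(),
)

# "Total params" is tiny, yet "Forward/backward pass size (MB)" is large,
# because every intermediate feature map is counted.
summary(model, input_size=(3, 512, 512), device="cpu")
```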

I think the forward pass size probably refers to the amount of computation. Is it the same as FLOPs?

I also wonder whether the "model size" we commonly talk about is the sum of both, or just the number of parameters. I am looking forward to your answer.

Hi,

All these numbers in MB correspond to the expected memory needed to run the model.
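As a rough sketch of where those MB numbers come from (assuming float32, i.e. 4 bytes per element, and assuming every layer output is counted twice, once for the activations kept for backward and once for their gradients; that factor of 2 is my reading, not something I have verified in the source):

```python
BYTES_PER_FLOAT32 = 4

def params_size_mb(num_params):
    # Parameter memory: one float32 value per parameter.
    return num_params * BYTES_PER_FLOAT32 / 1024 ** 2

def forward_backward_size_mb(total_output_elements):
    # Activation memory: layer outputs kept for the backward pass, plus
    # roughly the same amount again for their gradients (assumed factor of 2).
    return 2 * total_output_elements * BYTES_PER_FLOAT32 / 1024 ** 2

# e.g. a single 16 x 512 x 512 feature map:
print(forward_backward_size_mb(16 * 512 * 512))  # -> 32.0 (MB)
```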

I'm not sure about its exact meaning, but I tried changing the input shape: the "Params" statistics stayed the same while the "Forward/backward pass size" changed. So my guess is that it's related to FLOPs.
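Here is roughly the check I ran (a sketch with an arbitrary small model; the layer sizes are made up):

```python
import torch.nn as nn
from torchsummary import summary

model = nn.Sequential(
    nn.Conv2d(3, 8, kernel_size=3, padding=1),
    nn.ReLU(),
)

# Same model, two different input sizes: "Total params" is identical in both
# summaries, but "Forward/backward pass size (MB)" grows with the input.
summary(model, input_size=(3, 32, 32), device="cpu")
summary(model, input_size=(3, 128, 128), device="cpu")
```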