Does anyone have code for calculating the FLOPs of a training epoch?

I need to calculate FLOPs during training (forward pass + backpropagation). I would appreciate it if anyone has code to do so.


There isn’t an official way to do this, but some folks have written ways of approximating it (see the sketch below):



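As one example of that kind of approximation, here is a minimal sketch using the fvcore package and a torchvision model (both are assumptions, not something from the original thread). Note it only covers the forward pass:

```python
import torch
from torchvision.models import resnet18
from fvcore.nn import FlopCountAnalysis  # pip install fvcore

# Count forward-pass FLOPs for a single input batch.
# Caveat: fvcore counts a fused multiply-add as one FLOP for most ops,
# and it only traces the forward pass, not backpropagation.
model = resnet18()
dummy_input = torch.randn(1, 3, 224, 224)

flops = FlopCountAnalysis(model, dummy_input)
print(f"Forward-pass FLOPs (approx.): {flops.total():,}")
```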

Thank you, but it looks like they only consider the forward pass, i.e. inference time.
I need something that calculates FLOPs for both the forward pass and backpropagation.
Or am I wrong, and these codes also work for backpropagation?

More likely the former: most papers and code are concerned with the complexity of the inference process.
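One partial workaround: `torch.profiler` can attribute FLOPs to the operators recorded during a full training step, including the backward pass. A minimal sketch with a toy model (the model, batch, and loss below are placeholders, not from the thread):

```python
import torch
import torch.nn as nn
from torch.profiler import profile, ProfilerActivity

# Toy model and batch; replace with your own.
model = nn.Sequential(nn.Linear(128, 256), nn.ReLU(), nn.Linear(256, 10))
x = torch.randn(32, 128)
target = torch.randint(0, 10, (32,))
criterion = nn.CrossEntropyLoss()

# Profile one full training step (forward + backward).
with profile(activities=[ProfilerActivity.CPU],
             record_shapes=True, with_flops=True) as prof:
    loss = criterion(model(x), target)
    loss.backward()

# Sum FLOPs over all recorded operators. Only operators PyTorch has
# FLOP formulas for (mainly matmuls and convolutions) are counted,
# so treat this as a rough lower-bound approximation.
total_flops = sum(evt.flops for evt in prof.key_averages())
print(f"Approximate FLOPs for one training step: {total_flops:,}")
```

Multiplying the per-step count by the number of steps per epoch then gives a rough per-epoch figure.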


OpenAI has a report on the training FLOPs of various models:

They used the following equation to calculate training FLOPs:

(add-multiplies per forward pass) * (2 FLOPs/add-multiply) * (3 for forward and backward pass) * (number of examples in dataset) * (number of epochs)
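For example, plugging that equation into a few lines of Python (all the numbers below are hypothetical, purely for illustration):

```python
def openai_training_flops(forward_madds: float, n_examples: int, n_epochs: int) -> float:
    """Training FLOPs per the rule of thumb above:
    2 FLOPs per add-multiply, times 3 for the forward + backward pass."""
    return forward_madds * 2 * 3 * n_examples * n_epochs

# Hypothetical inputs: ~1.8e9 add-multiplies per forward pass
# (roughly a ResNet-18 on a 224x224 image), 50,000 training
# examples, 90 epochs.
print(f"{openai_training_flops(1.8e9, 50_000, 90):.3e} FLOPs")
```

The factor of 3 is the rule-of-thumb estimate that the backward pass costs about twice the forward pass.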