Could you please let me know how I can count the number of flops related to the batch normalization layer theoretically?
FLOPs: note that the final s is lowercase. It is short for FLoating point OPerations (the s marks the plural), i.e., the number of floating-point operations, which serves as a measure of computational cost and hence of model complexity. When evaluating the complexity of a neural network model, refer to FLOPs, not FLOPS (FLoating point Operations Per Second, which measures hardware speed instead).
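As for the batch normalization layer itself: at inference time BN computes y = gamma * (x - mean) / sqrt(var + eps) + beta for each element, so the theoretical count is a small constant number of FLOPs per output element. Below is a minimal sketch of that count; the function name and the fused/unfused split are illustrative assumptions, and different profiling tools pick slightly different per-element constants.

```python
def batchnorm_flops(num_features, spatial_size, fused=True):
    """Theoretical inference-time FLOPs for one BatchNorm layer.

    num_features: number of channels C
    spatial_size: number of positions per channel (e.g. H * W)
    fused: whether scale/shift constants are precomputed
    """
    elements = num_features * spatial_size
    if fused:
        # With scale = gamma / sqrt(var + eps) and
        # shift = beta - scale * mean folded into constants,
        # BN is y = scale * x + shift: 1 multiply + 1 add per element.
        return 2 * elements
    # Unfused: (x - mean) / sqrt(var + eps) * gamma + beta, counted
    # as one subtract, one divide, one multiply, one add per element.
    return 4 * elements

# Example: BN over a 64-channel 56x56 feature map
print(batchnorm_flops(64, 56 * 56))         # 401408 (fused)
print(batchnorm_flops(64, 56 * 56, False))  # 802816 (unfused)
```

During training the statistics (mean and variance) must also be computed over the batch, which adds a few more operations per element, so the training-time count is somewhat higher than the inference-time figures above.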