Why doesn't the FLOP count of a model change when the activation function changes?

Hi.
I am using the following code to count the number of FLOPs in my model:

import torch
from thop import profile, clever_format

def measure(model):
    model.eval()
    model.cuda()

    # Dummy input matching my data: batch of 1, 2 channels, 128x128
    dummy_input = torch.randn(1, 2, 128, 128).cuda()

    # thop's profile returns MACs (multiply-accumulates) and the parameter count
    macs, params = profile(model, inputs=(dummy_input,), verbose=0)
    macs, params = clever_format([macs, params], "%.3f")
    print("<" * 10)
    print("Flops:", macs)
    print("Parameters:", params)

I tried different activation functions in my model, such as ReLU, Swish, Mish, and TanhExp, but all of them give the same number of FLOPs.
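For illustration, here is a minimal sketch of the kind of setup I mean; the layer sizes and the Swish/Mish definitions below are just stand-ins, not my actual network:

import torch
import torch.nn as nn
import torch.nn.functional as F

class Swish(nn.Module):
    # Swish: x * sigmoid(x)
    def forward(self, x):
        return x * torch.sigmoid(x)

class Mish(nn.Module):
    # Mish: x * tanh(softplus(x))
    def forward(self, x):
        return x * torch.tanh(F.softplus(x))

def make_net(act):
    # Identical conv backbone; only the activation module differs
    return nn.Sequential(
        nn.Conv2d(2, 16, kernel_size=3, padding=1),
        act,
        nn.Conv2d(16, 16, kernel_size=3, padding=1),
        act,
    )

for act in [nn.ReLU(), Swish(), Mish()]:
    measure(make_net(act))  # every variant prints the same "Flops"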

So my questions are:

  1. Why does the FLOP count not change when the activation function changes?
  2. Am I using the correct approach for calculating FLOPs?