Number of parameters and FLOPs in network

Suppose there are two models. One has 1M parameters and the other has 3M parameters, but both have the same FLOPs count. What is the advantage of the model with fewer parameters?

Size would be one advantage.

Can you explain a bit how size would be an advantage? Does it require less hardware to load the model, use less runtime memory, or run faster than the larger model?

For instance, say you want to run model inference on a Raspberry Pi, and suppose you've got 4MB of RAM, part of which goes to overhead. At half precision (2 bytes per parameter), you could run the 1M-parameter model, which uses 2MB just for the weights. The 3M-parameter model would need 6MB for its weights alone and wouldn't fit at all.
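The arithmetic above can be sketched in a few lines. This is a rough estimate of weight memory only (it ignores activations and framework overhead), using standard dtype sizes rather than any particular framework's accounting:

```python
# Bytes per parameter for common precisions (standard dtype sizes).
BYTES_PER_PARAM = {"fp32": 4, "fp16": 2, "int8": 1}

def weight_memory_mb(num_params: int, dtype: str = "fp16") -> float:
    """Memory needed just to hold the weights, in MB (1 MB = 1e6 bytes)."""
    return num_params * BYTES_PER_PARAM[dtype] / 1e6

# The two hypothetical models from the question: 1M vs 3M parameters.
for params in (1_000_000, 3_000_000):
    for dtype in ("fp32", "fp16", "int8"):
        mb = weight_memory_mb(params, dtype)
        print(f"{params // 1_000_000}M params @ {dtype}: {mb:.1f} MB")
```

At fp16 the 1M model needs 2.0MB and the 3M model needs 6.0MB, so only the smaller one fits the hypothetical 4MB budget; quantizing to int8 would shrink both further, but the parameter count still sets the scale.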