Does the number of parameters of a network change as the amount of data changes?

I used the following code to count all the trainable parameters in my models:

  print("numparams")
  print (sum(p.numel() for p in model.parameters() if p.requires_grad))

and I noticed that as I change the number of classes in my dataset, the number of trainable parameters changes too: from 7,351,300 for 4 classes to 7,354,378 for 10 classes.

I read some blog posts and thought that the parameter count depended only on the number of layers and neurons. Is this not true?

Thank you in advance

When you change the number of classes, the last layer changes, and so does the number of trainable parameters.
Think of the last layer of your model as a simple fully connected (linear) layer: its weight matrix has one row of weights per class, so changing the number of classes changes the number of trainable parameters.
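
For example, here is a minimal sketch with a single nn.Linear head. The 512-dimensional input is just an assumed value, but note that (512 + 1) × (10 − 4) = 3,078, which matches the difference you observed, so your model's final layer probably does take a 512-dimensional input:

  import torch.nn as nn

  def count_trainable(module):
      # same counting logic as in your snippet
      return sum(p.numel() for p in module.parameters() if p.requires_grad)

  head_4 = nn.Linear(512, 4)    # 512 * 4 weights  + 4 biases  -> 2,052 parameters
  head_10 = nn.Linear(512, 10)  # 512 * 10 weights + 10 biases -> 5,130 parameters

  print(count_trainable(head_4))                             # 2052
  print(count_trainable(head_10))                            # 5130
  print(count_trainable(head_10) - count_trainable(head_4))  # 3078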


Thank you for explaining. Is 7 million parameters normal for a CNN plus LSTM architecture?

Yes, it is, but it totally depends on your chosen architecture. For example, VGG has approximately 135M parameters and ResNet-18 has about 11.5M.
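
If you want to check numbers like these yourself, you can count the parameters of the torchvision reference models directly (this sketch assumes torchvision is installed; the exact totals depend on the VGG variant):

  import torchvision.models as models

  def count_trainable(module):
      return sum(p.numel() for p in module.parameters() if p.requires_grad)

  # Instantiate untrained reference models and count their trainable parameters.
  print(count_trainable(models.vgg16()))     # roughly 138M for VGG-16
  print(count_trainable(models.resnet18()))  # roughly 11.7M for ResNet-18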

@peony Generally, it does not change as long as you do not change the architecture of your ANN. For a “normal” feed-forward model, you have the weights and the bias per layer. Keep in mind that batch normalization layers also contain parameters that will be learned, while the optimizer only keeps additional state (e.g. momentum buffers), which is not counted among the trainable parameters.
In your example, don’t forget that the last layer is connected by weights to the output neurons as well, which explains why the number of parameters increases when you change the number of classes from 4 to 10.
Say your last hidden layer has 100 neurons and you have either 4 or 10 output neurons (the number of classes): you end up with 100 × 4 vs. 100 × 10 weights → 400 vs. 1,000 trainable parameters (plus 4 vs. 10 bias terms).
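
You can verify that arithmetic with a plain nn.Linear layer (a small sketch; the 100-neuron hidden size is just the value from the example above):

  import torch.nn as nn

  for num_classes in (4, 10):
      layer = nn.Linear(100, num_classes)
      # weight has 100 * num_classes entries, bias has num_classes entries
      print(num_classes, layer.weight.numel(), layer.bias.numel())
      # prints: 4 400 4   and   10 1000 10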
