I used the following code to count all the trainable parameters in my models:

print("numparams")
print(sum(p.numel() for p in model.parameters() if p.requires_grad))

and I notice as I change the number of classes in my dataset, the number of trainable parameters changes too. From 7351300 for 4 classes to 7354378 for 10 classes.

I read some blogs and thought that parameters only involved the number of layers and neurons. Is this not true?

When you change the number of classes, the shape of the last layer changes, and with it the number of trainable parameters.
Just assume that the last layer of your model is a simple linear (MLP) head: changing the number of classes changes its output dimension, so the number of trainable parameters changes.
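A minimal sketch of this with a toy model (the layer sizes here are made up for illustration): only the final `nn.Linear` depends on the class count, so swapping 4 classes for 10 changes the total reported by the counting expression from the question.

```python
import torch.nn as nn

def count_trainable(model):
    # same counting expression as in the question
    return sum(p.numel() for p in model.parameters() if p.requires_grad)

def make_model(num_classes):
    # hypothetical tiny network: only the head depends on num_classes
    return nn.Sequential(
        nn.Linear(32, 100),           # 32*100 weights + 100 biases = 3300
        nn.ReLU(),
        nn.Linear(100, num_classes),  # 100*num_classes weights + num_classes biases
    )

print(count_trainable(make_model(4)))   # 3300 + 404 = 3704
print(count_trainable(make_model(10)))  # 3300 + 1010 = 4310
```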

@peony Generally, it does not change, as long as you do not change the architecture of your ANN. For a “normal” feed-forward model, you have the weights and the bias per layer. Keep in mind that batch normalization layers (and some optimizer setups) can include parameters that will be learned, too.
In your given example, don’t forget that the last hidden layer is connected by weights to the output neurons as well, which explains why the number of parameters increases when you change the number of classes from 4 to 10.
Let’s say your last hidden layer has 100 neurons and then you have either 4 or 10 output neurons (number of classes). You would end up with 100 × 4 = 400 vs. 100 × 10 = 1000 weights, plus 4 vs. 10 biases → 404 vs. 1010 trainable parameters.
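The same arithmetic can be checked directly. As a side note, the numbers in the question are consistent with a single linear head: the difference 7354378 − 7351300 = 3078 equals 6 extra classes × (in_features + 1) if the head has 512 input features (an assumption, since the question does not state the architecture).

```python
# weights + biases for a linear classification head with 100 input features
hidden = 100
for num_classes in (4, 10):
    weights = hidden * num_classes
    biases = num_classes
    print(num_classes, weights + biases)  # prints 4 404, then 10 1010

# difference from the question, assuming a 512-feature head (hypothetical)
print(7354378 - 7351300)  # 3078
print(6 * (512 + 1))      # 3078
```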