How can I count model parameters in PyTorch? The torchstat package hasn't been updated in a long time, and I run into errors when I use it.
Hi,
you can count them as follows:
num_params = sum(param.numel() for param in model.parameters())
or:
num_params = sum(param.numel() for param in model.parameters() if param.requires_grad)
to only consider trainable parameters.
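For instance, here is a minimal self-contained sketch (the small Sequential model is just an illustrative stand-in for your own model):
import torch.nn as nn

# Illustrative stand-in model; any nn.Module works the same way.
model = nn.Sequential(
    nn.Linear(128, 64),
    nn.ReLU(),
    nn.Linear(64, 10),
)

# All parameters: (128*64 + 64) + (64*10 + 10) = 8906
num_params = sum(param.numel() for param in model.parameters())

# Only trainable parameters (identical here, since nothing is frozen)
num_trainable = sum(param.numel() for param in model.parameters() if param.requires_grad)

print(num_params, num_trainable)  # 8906 8906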
Thank you!! If I want to see the parameters in each layer, how can I do that?
Listing all modules in the model can be helpful if you want to see the parameters in each layer:
for name, module in model.named_modules():
    print(name, sum(param.numel() for param in module.parameters()))
This will print every module together with its parameter count, including parameter-free modules such as activation functions or dropout layers. Note that parent modules (and the root module itself) report the sum over all of their submodules.
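If a per-tensor breakdown is also useful, named_parameters() lists each weight and bias tensor separately (a small sketch; the example output assumes a model with two Linear layers of shapes 128→64 and 64→10):
# Per-tensor breakdown: one line per weight/bias tensor.
for name, param in model.named_parameters():
    print(name, param.numel())
# Example output for nn.Sequential(nn.Linear(128, 64), nn.ReLU(), nn.Linear(64, 10)):
# 0.weight 8192
# 0.bias 64
# 2.weight 640
# 2.bias 10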
You can use the torchsummary library for that:
from torchsummary import summary
summary(model, (3, 224, 224), device='cpu')
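A minimal runnable sketch (the CNN here is only an illustrative placeholder for your own model):
import torch.nn as nn
from torchsummary import summary

# Illustrative CNN; replace with your own model.
model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.AdaptiveAvgPool2d(1),
    nn.Flatten(),
    nn.Linear(16, 10),
)

# Prints a per-layer table of output shapes and parameter counts.
summary(model, (3, 224, 224), device='cpu')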