I am not sure why torchinfo is not reporting the size of my model. For TinyVGG, the summary only shows:

TinyVGG –
Total params: 108,343
It works for me using the latest torchinfo version:
from torchinfo import summary
from torchvision import models

model = models.resnet18()
batch_size = 16
summary(model, input_size=(batch_size, 3, 224, 224))
# ==========================================================================================
# Layer (type:depth-idx) Output Shape Param #
# ==========================================================================================
# ResNet [16, 1000] --
# ├─Conv2d: 1-1 [16, 64, 112, 112] 9,408
# ├─BatchNorm2d: 1-2 [16, 64, 112, 112] 128
# ├─ReLU: 1-3 [16, 64, 112, 112] --
# ├─MaxPool2d: 1-4 [16, 64, 56, 56] --
# ├─Sequential: 1-5 [16, 64, 56, 56] --
# │ └─BasicBlock: 2-1 [16, 64, 56, 56] --
# │ │ └─Conv2d: 3-1 [16, 64, 56, 56] 36,864
# ...
# │ │ └─ReLU: 3-51 [16, 512, 7, 7] --
# ├─AdaptiveAvgPool2d: 1-9 [16, 512, 1, 1] --
# ├─Linear: 1-10 [16, 1000] 513,000
# ==========================================================================================
# Total params: 11,689,512
# Trainable params: 11,689,512
# Non-trainable params: 0
# Total mult-adds (G): 29.03
# ==========================================================================================
# Input size (MB): 9.63
# Forward/backward pass size (MB): 635.96
# Params size (MB): 46.76
# Estimated Total Size (MB): 692.35
# ==========================================================================================
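In case it helps narrow things down: as far as I know, torchinfo only prints the Input/Forward-backward/Params/Estimated Total Size lines when it can run a forward pass, i.e. when you pass input_size or input_data. If summary is called with just the model, it only counts parameters, which would match the truncated output you posted. A minimal sketch of the difference (same resnet18 example; the behavior should be the same for your TinyVGG):

from torchinfo import summary
from torchvision import models

model = models.resnet18()

# Without an input, torchinfo only counts parameters;
# the size-in-MB lines at the bottom are not printed.
summary(model)

# Passing input_size (or input_data) lets torchinfo trace a forward pass
# and report output shapes, mult-adds, and the estimated sizes.
summary(model, input_size=(16, 3, 224, 224))

So if something changed in how you call summary (e.g. the input_size argument was dropped), that could explain why the size disappeared.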
Not sure if this helps, but it was working originally. Then something changed (I am not sure what) and it no longer shows the size information.