I am new to PyTorch. I am constructing a simple model in two ways: once with nn.Sequential and once with the individual layers defined separately, but the two versions report different numbers of parameters. Can someone explain the reason for the different parameter counts? My implementation is:
import torch
import torch.nn as nn
from pytorch_model_summary import summary

class Net1(nn.Module):
    def __init__(self):
        super(Net1, self).__init__()
        self.Seqconv = nn.Sequential(
            nn.Conv2d(in_channels=3, out_channels=32, kernel_size=3,
                      stride=2, padding=1, bias=False),
            nn.BatchNorm2d(32),
            nn.ReLU(inplace=True)
        )

    def forward(self, x):
        x = self.Seqconv(x)
        return x

class Net2(nn.Module):
    def __init__(self):
        super(Net2, self).__init__()
        self.conv = nn.Conv2d(in_channels=3, out_channels=32, kernel_size=(3, 3),
                              stride=2, padding=1)
        self.bn = nn.BatchNorm2d(32)
        self.relu = nn.ReLU()

    def forward(self, x):
        x = self.relu(self.bn(self.conv(x)))
        return x

x = torch.randn(1, 3, 224, 224)
print(summary(Net1(), x, show_input=True))
# Yields Total params: 928
print(summary(Net2(), x, show_input=True))
# Yields Total params: 960
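The difference is exactly the 32 convolution bias terms: Net1 passes bias=False to its Conv2d, while Net2 uses the default bias=True. A quick sketch of the arithmetic (pure Python, with hypothetical helper names chosen for illustration) makes the counts explicit:

```python
def conv2d_params(c_in, c_out, k, bias=True):
    # Each output channel has a k x k filter per input channel,
    # plus one bias term per output channel if bias is enabled.
    weights = c_in * c_out * k * k
    return weights + (c_out if bias else 0)

def batchnorm2d_params(c):
    # BatchNorm2d learns a scale (gamma) and a shift (beta) per channel.
    return 2 * c

net1 = conv2d_params(3, 32, 3, bias=False) + batchnorm2d_params(32)
net2 = conv2d_params(3, 32, 3, bias=True) + batchnorm2d_params(32)
print(net1)  # 928
print(net2)  # 960
print(net2 - net1)  # 32, the Conv2d bias vector of Net2
```

Dropping the convolution bias when a BatchNorm layer follows is a common choice: BatchNorm's per-channel shift makes the preceding bias redundant.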
Perhaps I was a bit commanding in my way of formulating things. Note that my comment is about "as", not about "import" vs. "from … import". AFAIK there is no noticeable usage difference between assigning "as nn" and just importing the submodule. However, "as" is intended to specify a different identifier for the loaded module, so it seems a bit silly to assign it to the same name it already has.
Yeah, I think you are right, and I have to admit that I never questioned it, as I automatically type it the "as nn" way based on the first official tutorials (e.g. ImageNet example, MNIST example).
Well, perhaps it comes down to preference anyway! I was too strong in my statement. I don't think there is one right or wrong answer. To me, my way seems "cleaner", but that's subjective.
Oh no, I really don't want to say you are wrong, and I think the "as nn" import is really not necessary.
After you mentioned it, it was probably the first time I thought about it.
Sorry, I don't want to hijack this topic for this discussion.