Forward block in Inception network for creating BasicConv2d?


I just want to know: while creating the Inception network, why did the author implement "BasicConv2d" as a class with a forward block? Why can't we just use a function, like the conv3x3 defined below?

Is there any special need to define class BasicConv2d(nn.Module) instead of def conv3x3?

class BasicConv2d(nn.Module):

    def __init__(self, in_channels, out_channels, **kwargs):
        super(BasicConv2d, self).__init__()
        self.conv = nn.Conv2d(in_channels, out_channels, bias=False, **kwargs)
        self.bn = nn.BatchNorm2d(out_channels, eps=0.001)

    def forward(self, x):
        x = self.conv(x)
        x = self.bn(x)
        return F.relu(x, inplace=True)

def conv3x3(in_planes, out_planes, kernel_size, **kwargs):
    """3x3 convolution with padding"""
    return nn.Conv2d(in_planes, out_planes, kernel_size, **kwargs)

The BasicConv2d module defines a batchnorm layer besides the conv layer and also applies a relu to the output, while conv3x3 just defines a conv layer.
Both approaches can be used in one way or the other, but they define different abstraction levels as explained before.
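To make the difference concrete, here is a minimal sketch (assuming the usual torch imports) that builds both versions side by side. The module version packages conv + batchnorm + relu into one reusable block and registers the batchnorm parameters automatically, while the factory function just hands back a bare conv layer:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class BasicConv2d(nn.Module):
    """Conv -> BatchNorm -> ReLU bundled as one reusable module."""
    def __init__(self, in_channels, out_channels, **kwargs):
        super(BasicConv2d, self).__init__()
        self.conv = nn.Conv2d(in_channels, out_channels, bias=False, **kwargs)
        self.bn = nn.BatchNorm2d(out_channels, eps=0.001)

    def forward(self, x):
        x = self.conv(x)
        x = self.bn(x)
        return F.relu(x, inplace=True)

def conv3x3(in_planes, out_planes, kernel_size, **kwargs):
    """Factory function: returns a bare conv layer, no bn or relu."""
    return nn.Conv2d(in_planes, out_planes, kernel_size, **kwargs)

block = BasicConv2d(3, 16, kernel_size=3, padding=1)
layer = conv3x3(3, 16, 3, padding=1)

x = torch.randn(1, 3, 8, 8)
print(block(x).shape)  # torch.Size([1, 16, 8, 8])
print(layer(x).shape)  # torch.Size([1, 16, 8, 8])

# Only the module version carries batchnorm state alongside the conv weight:
print("bn.weight" in block.state_dict())  # True
```

Both produce the same output shape, but the module version also owns the batchnorm parameters and running statistics, so they travel with the block through state_dict, .to(device), .train()/.eval(), and so on.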

Thanks @ptrblck for replying.

If I use the function below instead of the class, will it also work?

def conv2d(in_channels, out_channels, **kwargs):
    conv2d = nn.Sequential(
        nn.Conv2d(in_channels, out_channels, **kwargs),
        nn.BatchNorm2d(out_channels, eps=0.001)
    )
    return conv2d

Yes, this should return the nn.Sequential container with the specified modules.
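As a quick sanity check, here is the factory in use (a sketch with made-up channel sizes; note that, unlike BasicConv2d, this version does not apply a relu). Once the returned nn.Sequential is assigned as an attribute of a parent nn.Module, its parameters are registered and trained like any other submodule:

```python
import torch
import torch.nn as nn

def conv2d(in_channels, out_channels, **kwargs):
    # Factory returning conv + batchnorm packaged in an nn.Sequential
    return nn.Sequential(
        nn.Conv2d(in_channels, out_channels, **kwargs),
        nn.BatchNorm2d(out_channels, eps=0.001)
    )

block = conv2d(3, 16, kernel_size=3, padding=1)
x = torch.randn(2, 3, 8, 8)
out = block(x)
print(out.shape)  # torch.Size([2, 16, 8, 8])

# conv weight + conv bias + bn weight + bn bias:
print(len(list(block.parameters())))  # 4
```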
