ERROR: optimizer got an empty parameter list

Pooling layers do not have trainable parameters, so the optimizer raises this error. Make sure to create layers with parameters if you want to train the model.
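For example, here is a minimal sketch that reproduces the error; the model below is purely illustrative and contains nothing but a parameter-free pooling layer:

import torch
import torch.nn as nn

# Pooling has no weights, so this model exposes no trainable parameters.
model = nn.Sequential(nn.MaxPool2d(2))
print(list(model.parameters()))  # prints [] -> empty parameter list

# Constructing the optimizer from an empty parameter list raises:
# ValueError: optimizer got an empty parameter list
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)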

Sorry, I can't understand what you are trying to explain. Can you please briefly describe it to me?

class ConvNet(nn.Module):

    def __init__(self, num_classes=10, activation_fn: str = 'ReLU'):
        super(ConvNet, self).__init__()
        self.num_classes = num_classes
        self.activation_fn = activation_fn
        # Output size after a convolution filter:
        # ((W - F + 2P) / S) + 1

    def activation(self, x):
        # Instantiate the chosen activation module, then apply it to x
        if self.activation_fn == 'ReLU':
            return nn.ReLU()(x)
        if self.activation_fn == 'GELU':
            return nn.GELU()(x)
        if self.activation_fn == 'SiLU':
            return nn.SiLU()(x)
        if self.activation_fn == 'Mish':
            return nn.Mish()(x)

    def forward(self, inputs, droprate, batch_norm, kernel_size, filter_org, activation_fn):
        self.inputs = inputs
        cur_in = inputs
        in_channels = 3
        kernels = [kernel_size] * 5  # kernel_size is a tuple, e.g. (3, 3, 3)
        size = inputs.shape
        self.pool = nn.MaxPool3d(2)
        for i in range(5):
            if i > 1:
                kernels[i] = kernels[i - 1] * filter_org
            # Convolution and (optional) batch norm for this block
            self.conv = nn.Conv3d(in_channels, in_channels * kernels[i][0],
                                  kernels[i], stride=1, padding=1)
            self.batch_norm = nn.BatchNorm3d(in_channels * kernels[i][0])
            if batch_norm:
                cur_out = self.pool(activation_fn(self.batch_norm(self.conv(cur_in))))
            else:
                cur_out = self.pool(activation_fn(self.conv(cur_in)))

            cur_in = cur_out
            in_channels = in_channels * kernels[i][0]
            # Track the spatial size after conv (stride 1, padding 1) and 2x pooling
            size = (((size[0] - kernels[i][0] + 2 * 1) / 1 + 1) / 2,
                    ((size[1] - kernels[i][1] + 2 * 1) / 1 + 1) / 2,
                    in_channels)

        # Classifier head
        self.fc = nn.Linear(size[0] * size[1] * size[2], self.num_classes)
        self.dropout = nn.Dropout(droprate)
        cur_out = cur_out.view(-1, size[0] * size[1] * size[2])
        cur_out = self.fc(self.dropout(cur_out))
        self.fc2 = nn.Linear(self.num_classes, self.num_classes)
        cur_out = self.fc2(cur_out)
        self.softmax = nn.Softmax(dim=1)
        cur_out = self.softmax(cur_out)
        return cur_out

Above is my model.

But I am getting this error. Can anyone kindly help out?

Your model doesn't initialize any trainable layers or parameters in its __init__ method, so the error is expected.
Creating new modules in the forward method recreates them in every iteration, so they will never be trained. Make sure to initialize them in the __init__ method instead, as described in this tutorial.
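As an illustration only (not your exact architecture), here is a minimal sketch of the pattern: all trainable modules are created once in __init__, so model.parameters() is already populated when the optimizer is built.

import torch
import torch.nn as nn

class SmallConvNet(nn.Module):
    def __init__(self, num_classes=10):
        super().__init__()
        # Create trainable modules once here; they are registered as
        # submodules, so model.parameters() will return their weights.
        self.conv = nn.Conv2d(3, 16, kernel_size=3, padding=1)
        self.pool = nn.MaxPool2d(2)
        self.act = nn.ReLU()
        self.fc = nn.Linear(16 * 16 * 16, num_classes)

    def forward(self, x):  # expects x of shape (N, 3, 32, 32)
        x = self.pool(self.act(self.conv(x)))  # -> (N, 16, 16, 16)
        x = x.view(x.size(0), -1)
        return self.fc(x)

model = SmallConvNet()
print(sum(p.numel() for p in model.parameters()))  # non-zero parameter count
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)  # no error now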