Other functions for a CNN

What other functions, apart from nn.Linear and log_softmax, exist in PyTorch?
How would I configure them in this network:

import torch
import torch.nn as nn
import torch.nn.functional as F

# Placeholder hyperparameters kept from the original post; set them to
# whatever matches your data (the values below assume 32x32 RGB input)
tipo_image = 3      # number of input channels (3 = RGB, 1 = grayscale)
kernel = 3
stridee = 1
paddingg = 1
biass = True
f = 4               # feature-map side length after the three pooling layers

class ConvolutionalNetwork(nn.Module):
    def __init__(self):
        super().__init__()  # required when subclassing nn.Module
        self.conv1 = nn.Conv2d(tipo_image, 6, kernel_size=(kernel, kernel), stride=stridee, padding=paddingg, bias=biass)
        self.conv2 = nn.Conv2d(6, 16, kernel_size=(kernel, kernel), stride=stridee, padding=paddingg, bias=biass)
        self.conv3 = nn.Conv2d(16, 32, kernel_size=(kernel, kernel), stride=stridee, padding=paddingg, bias=biass)

        self.fc1 = nn.Linear(f * f * 32, 120)
        self.fc2 = nn.Linear(120, 84)
        self.fc3 = nn.Linear(84, 18)

    def forward(self, X):
        X = F.relu(self.conv1(X))
        X = F.max_pool2d(X, 2, 2)
        X = F.relu(self.conv2(X))
        X = F.max_pool2d(X, 2, 2)
        X = F.relu(self.conv3(X))
        X = F.max_pool2d(X, 2, 2)
        X = X.view(-1, f * f * 32)
        X = F.relu(self.fc1(X))
        X = F.relu(self.fc2(X))
        X = self.fc3(X)
        return F.log_softmax(X, dim=1)

CNNmodel = ConvolutionalNetwork()
# forward() already applies log_softmax, so pair it with NLLLoss;
# CrossEntropyLoss would apply log_softmax a second time.
criterion = nn.NLLLoss()
optimizer = torch.optim.Adam(CNNmodel.parameters(), lr=0.001)

@jasg Not sure if I understood your question. Could you please clarify it for me?

OK, apart from nn.Linear and the softmax function, what other alternatives can I use in the ConvolutionalNetwork class?

The torch.nn documentation lists all of the non-linear activation functions available in PyTorch (ReLU, LeakyReLU, ELU, GELU, Tanh, Sigmoid, Softmax, and others).

I am sure you know this, but just to reiterate: we commonly use sigmoid for binary classification and softmax/log_softmax for multi-class classification.
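To make the alternatives concrete, here is a small sketch comparing a few of PyTorch's built-in activations on the same input. All module names below are real `torch.nn` classes; the input values are just an arbitrary example.

```python
import torch
import torch.nn as nn

x = torch.tensor([[-1.0, 0.0, 2.0]])

# A few drop-in alternatives to ReLU; each is an nn.Module you can
# place in a network exactly where F.relu is used in the code above
activations = {
    "relu": nn.ReLU(),            # max(0, x)
    "leaky_relu": nn.LeakyReLU(0.1),  # small slope for negative inputs
    "elu": nn.ELU(),              # smooth exponential for negatives
    "gelu": nn.GELU(),            # used in transformers
    "tanh": nn.Tanh(),            # squashes to (-1, 1)
}

for name, act in activations.items():
    print(f"{name:12s}", act(x))
```

Swapping one of these for ReLU changes only how negative pre-activations are treated; the layer shapes and the rest of the network stay the same.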

Please let me know if you want more information.

I understand that since I have 18 classes I must use softmax, but it is not clear to me what the alternatives to the linear function would be. Sorry, this is new to me.

I am not sure whether you are asking only about nn.Linear now. If so, it basically adds a dense (fully connected) layer to your network.
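As a sketch of one common alternative to stacking dense layers (my own example, not from the thread): global average pooling collapses each feature map to a single number, so the classifier head needs far fewer parameters. The sizes below assume 32 feature maps of 6x6, matching the last conv layer of the network above.

```python
import torch
import torch.nn as nn

features = torch.randn(4, 32, 6, 6)  # a batch of conv feature maps

# Classic head: flatten, then a dense layer (what nn.Linear does here)
dense_head = nn.Sequential(nn.Flatten(), nn.Linear(32 * 6 * 6, 18))

# Alternative head: global average pooling, then one small dense layer
gap_head = nn.Sequential(nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(32, 18))

print(dense_head(features).shape)  # torch.Size([4, 18])
print(gap_head(features).shape)    # torch.Size([4, 18])
```

Both heads produce 18 class scores, but the pooled head has 32 * 18 weights instead of 32 * 6 * 6 * 18, which also makes it less sensitive to the input image size.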

I want to learn more about CNNs, so I am asking what other options besides nn.Linear I can use.

A typical CNN structure looks like this:
input -> convolution -> pooling -> convolution -> pooling -> fully connected (linear) -> output (softmax/sigmoid)

A convolution layer usually consists of conv -> batch norm -> non-linearity (e.g. ReLU).
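The conv -> batch norm -> ReLU block described above can be sketched with standard torch.nn modules like this (channel counts are arbitrary examples):

```python
import torch
import torch.nn as nn

# One "convolution layer" in the sense used above:
# convolution, then batch normalization, then a non-linearity
conv_block = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1, bias=False),  # bias is redundant before BN
    nn.BatchNorm2d(16),
    nn.ReLU(inplace=True),
)

x = torch.randn(1, 3, 32, 32)
print(conv_block(x).shape)  # torch.Size([1, 16, 32, 32])
```

`bias=False` is the usual choice here because BatchNorm2d adds its own learnable shift immediately after the convolution.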

Hope this helps.