Add ReLU activation to last layer of pre-trained VGG16

Hi everyone,

I would like to add a ReLU activation to the last linear layer of a pre-trained VGG16 to force the outputs to be positive.

I tried the following:

import torch.nn as nn
from torchvision import models

m = models.vgg16(pretrained=True)
m.classifier[6] = nn.Linear(4096, 4)         # replace the final layer with a 4-output head
m.classifier[6].act = nn.ReLU(inplace=True)  # attach a ReLU to the new layer

Printing the model gives the following for the classifier part:

(classifier): Sequential(
    (0): Linear(in_features=25088, out_features=4096, bias=True)
    (1): ReLU(inplace=True)
    (2): Dropout(p=0.5, inplace=False)
    (3): Linear(in_features=4096, out_features=4096, bias=True)
    (4): ReLU(inplace=True)
    (5): Dropout(p=0.5, inplace=False)
    (6): Linear(
      in_features=4096, out_features=4, bias=True
      (act): ReLU(inplace=True)
    )
)

I still get negative outputs with this. Any hints on how to do this would be highly appreciated.
Thank you!

Maybe:

m.classifier.add_module('7', nn.ReLU())

Assigning the ReLU as an attribute of the Linear layer registers it as a submodule, but Linear's forward never calls it, so it has no effect on the output. Appending it as a new entry in the Sequential puts it in the forward pass.
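For reference, a minimal end-to-end sketch of the fix (the 4-output head follows the code above; the random input is just there to verify the behavior):

import torch
import torch.nn as nn
from torchvision import models

m = models.vgg16(pretrained=True)
m.classifier[6] = nn.Linear(4096, 4)
# Appending the ReLU as entry '7' of the Sequential makes it part of the
# forward pass, unlike attaching it as an attribute of the Linear layer:
m.classifier.add_module('7', nn.ReLU())

m.eval()
with torch.no_grad():
    out = m(torch.randn(1, 3, 224, 224))
print(out.min())  # now >= 0, since the output passes through the final ReLU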

Works perfectly, thank you!