How to replace layers of one type with another in a pretrained network?

As the title says: if I want to change all ReLU activation functions to ReLU6 (or another activation) in a pretrained network (e.g. resnet18), how can I achieve this in a convenient way?

(ideally in a for loop instead of hard-coding the index of each target layer)

The structure of the pretrained model mixes nn.Module and nn.Sequential containers, so I cannot change the layers easily.

Could you give me some hints? Any advice would be appreciated!


If you would like to swap the ReLU activation for another one, you can access self.relu directly:

import torch.nn as nn
from torchvision import models

model = models.resnet18()
model.relu = nn.ReLU6()
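
Note that in torchvision's resnet18 this only replaces the top-level ReLU; the activations inside the residual blocks are untouched. If you want to replace every ReLU regardless of nesting, you could recurse over named_children() and swap modules with setattr. A minimal sketch (replace_relu is just a hypothetical helper name, not a library function):

import torch.nn as nn
from torchvision import models

def replace_relu(module, new_act=nn.ReLU6):
    # Walk all direct children; swap every nn.ReLU for `new_act`,
    # otherwise recurse into the child container.
    for name, child in module.named_children():
        if isinstance(child, nn.ReLU):
            # Preserve the inplace flag of the original activation.
            setattr(module, name, new_act(inplace=child.inplace))
        else:
            replace_relu(child, new_act)

model = models.resnet18()
replace_relu(model)
print(model)  # every ReLU should now appear as ReLU6

This works for nn.Sequential containers too, since named_children() yields the key each child is registered under (the string indices "0", "1", ... for a Sequential), and setattr replaces the module stored under that key.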