As the title says, if I want to change every ReLU activation function to ReLU6 (or something else) in a pretrained network (e.g. resnet18), how can I do it conveniently?
(Ideally with a for loop over the modules instead of hard-coding the index of each target layer.)
The structure of the pretrained model mixes nn.Module and nn.Sequential containers, so I cannot change the activations easily.
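For concreteness, here is a sketch of the kind of loop I have in mind (the helper name `replace_relu` and the toy model are just my own illustration, not from resnet18 itself):

```python
import torch.nn as nn

def replace_relu(module: nn.Module) -> None:
    # Hypothetical helper: recursively walk child modules.
    # named_children() yields (attribute_name, child), so we can
    # reassign the attribute on the parent with setattr().
    for name, child in module.named_children():
        if isinstance(child, nn.ReLU):
            setattr(module, name, nn.ReLU6(inplace=child.inplace))
        else:
            replace_relu(child)

# Toy nested model standing in for resnet18's mix of
# nn.Module and nn.Sequential containers:
model = nn.Sequential(
    nn.Conv2d(3, 8, 3),
    nn.ReLU(inplace=True),
    nn.Sequential(nn.Conv2d(8, 8, 3), nn.ReLU()),
)
replace_relu(model)
```

Would a recursive approach like this work on an actual torchvision resnet18, or is there a more idiomatic way?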
Could you give me some hints? Any advice would be appreciated!