Remove all ReLU layers from a ResNet model in PyTorch

Can anyone suggest how to remove all ReLU layers from a ResNet model? And, if possible, replace them with some linear function?

You could replace the self.relu of the base model, as well as of all the blocks, with your desired activation.
This code snippet should work:

import torch.nn as nn
import torchvision.models as models

model = models.resnet50()

# torchvision's ResNet stores each activation as a 'relu' attribute,
# both on the top-level model and inside every residual block, so
# swapping that attribute replaces all of them.
for name, module in model.named_modules():
    if hasattr(module, 'relu'):
        module.relu = nn.Sigmoid()

print(model)
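
As a quick sanity check (assuming the torchvision ResNet, where every activation is registered as a submodule), you could confirm after running the snippet above that no nn.ReLU modules remain:

# Collect any remaining ReLU modules; the list should be empty.
remaining = [name for name, m in model.named_modules()
             if isinstance(m, nn.ReLU)]
print(remaining)  # expected: []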

Thank you. But what if I want to remove ReLU, or any kind of activation layer, completely?

Then you could replace them with nn.Identity() instead of another activation function.
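
For example, here is a minimal sketch using the same loop as above; nn.Identity simply passes its input through unchanged:

import torch
import torch.nn as nn
import torchvision.models as models

model = models.resnet50()

# Replace every activation stored under 'relu' with a no-op module.
for name, module in model.named_modules():
    if hasattr(module, 'relu'):
        module.relu = nn.Identity()

# The forward pass still works; only the nonlinearity is gone.
out = model(torch.randn(1, 3, 224, 224))
print(out.shape)  # torch.Size([1, 1000])

Keep in mind that with the nonlinearities gone, the stacked conv/batchnorm layers compose into (mostly) affine transforms, so the model loses most of its expressive power.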

Oh yeah, this is easier to do than deleting the ReLU layers. Awesome!