Replacing all ReLU(inplace=True) with ReLU(inplace=False) for all pretrained models

Hey,

I want to replace every ReLU(inplace=True) with ReLU(inplace=False), but I want the approach to be generic so it works for all pretrained models - vgg16, resnet, etc.

Is that possible, given that each model has a different architecture and therefore a different module hierarchy?

Thank you!

I think you should be able to use torch.fx with its ability to manipulate the graph, as described here. In that example they replace add() calls with mul() calls, and I assume you can use the same or a similar approach to replace the ReLU modules.
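
Something along these lines might work (untested sketch; `replace_relu_fx` is just an illustrative helper name, and it assumes the activations are `nn.ReLU` submodules, as in torchvision's vgg16/resnet, rather than functional `F.relu` calls, which would show up as `call_function` nodes and need separate handling):

```python
import torch.nn as nn
import torch.fx as fx
from torchvision import models

def replace_relu_fx(model: nn.Module) -> fx.GraphModule:
    # Symbolically trace the model into a flat graph of nodes
    traced = fx.symbolic_trace(model)
    for node in traced.graph.nodes:
        # nn.ReLU layers appear as call_module nodes; node.target is the
        # qualified submodule name, e.g. "features.1" inside a Sequential
        if node.op == "call_module" and isinstance(
            traced.get_submodule(node.target), nn.ReLU
        ):
            # Swap in a non-inplace ReLU at the same qualified path
            traced.add_submodule(node.target, nn.ReLU(inplace=False))
    return traced

model = replace_relu_fx(models.vgg16(weights=None))
```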

Thank you, but this solution doesn’t work - maybe I need to make it recursive…
It looks for the function calls, but those calls might be nested inside a Sequential layer or some other embedded layer…
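
For what it's worth, a plain recursive swap over `named_children()` (no torch.fx) might be enough here; this is only a sketch, it assumes the activations are `nn.ReLU` modules, and `replace_relu` is just an illustrative name:

```python
import torch.nn as nn
from torchvision import models

def replace_relu(module: nn.Module) -> None:
    # Walk the direct children; recurse into containers (Sequential, blocks, ...)
    for name, child in module.named_children():
        if isinstance(child, nn.ReLU):
            # setattr replaces the submodule in-place, even inside nn.Sequential
            setattr(module, name, nn.ReLU(inplace=False))
        else:
            replace_relu(child)

model = models.vgg16(weights=None)
replace_relu(model)
print(model)  # every ReLU should now report inplace=False
```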