Replace some layers in PyTorch

I have defined a custom layer, and I want to replace some layers in the original Net with it.
For example, the Net is defined as:

import torch.nn as nn
import torch.nn.functional as F

class Net(nn.Module):
    def __init__(self):
        super(Net, self).__init__()
        self.conv1 = nn.Conv2d(3, 6, 5)
        self.conv2 = nn.Conv2d(6, 16, 3)
        self.conv3 = nn.Conv2d(16, 32, 3)
        self.pool = nn.MaxPool2d(2, 2)
        self.fc1 = nn.Linear(32, 10)

    def forward(self, x):
        x = self.pool(F.relu(self.conv1(x)))
        x = self.pool(F.relu(self.conv2(x)))
        x = self.pool(F.relu(self.conv3(x)))
        x = x.view(-1, 32)
        x = self.fc1(x)
        return x

In this Net, F.relu is used, and it is this relu that I want to replace.
However, if I print Net():

Net(
  (conv1): Conv2d(3, 6, kernel_size=(5, 5), stride=(1, 1))
  (conv2): Conv2d(6, 16, kernel_size=(3, 3), stride=(1, 1))
  (conv3): Conv2d(16, 32, kernel_size=(3, 3), stride=(1, 1))
  (pool): MaxPool2d(kernel_size=2, stride=2, padding=0, dilation=1, ceil_mode=False)
  (fc1): Linear(in_features=32, out_features=10, bias=True)
)

I find that the relu operation does not appear as a submodule.

So I would like to know: if I want to modify the relu operation in Net (and the Net definition cannot be rewritten), how can I deal with it?
Can jit.script help with this process?

Since relu is used as a functional call, the cleanest way would be to override the class and reimplement the forward method with your new non-linearity.
By doing so, you could also create the new activation function as an nn.Module, so that it can be replaced easily, if that’s your use case.
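
For instance, here is a minimal sketch of that idea (assuming the Net class above is available; the MyNet name and the activation argument are illustrative, not part of the original code):

import torch
import torch.nn as nn

class MyNet(Net):
    def __init__(self, activation=None):
        super().__init__()
        # default to the original relu; pass e.g. nn.LeakyReLU() to swap it out
        self.activation = activation if activation is not None else nn.ReLU()

    def forward(self, x):
        # same forward as the original Net, but the non-linearity is now a module
        x = self.pool(self.activation(self.conv1(x)))
        x = self.pool(self.activation(self.conv2(x)))
        x = self.pool(self.activation(self.conv3(x)))
        x = x.view(-1, 32)
        x = self.fc1(x)
        return x

# usage: replace the activation without touching the conv/linear layers
model = MyNet(activation=nn.LeakyReLU())
out = model(torch.randn(1, 3, 24, 24))  # 24x24 input so that view(-1, 32) yields one row per sample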

My situation is that I receive a model and its Net definition from others, and I need to design a framework that replaces certain operations such as relu. So modifying the Net definition by hand is not realistic for me.
Is there any other approach?

You could try to override F.relu directly with your desired method, but I would consider this quite a hack, since it will invisibly redefine the method.
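
Just as a rough sketch of what that hack could look like (my_relu is a hypothetical replacement, and the 24x24 input is only chosen so the shapes in the example Net work out):

import torch
import torch.nn.functional as F

_original_relu = F.relu  # keep a reference so the patch can be undone

def my_relu(input, inplace=False):
    # e.g. use a leaky variant instead of the original relu
    return F.leaky_relu(input, negative_slope=0.01, inplace=inplace)

F.relu = my_relu  # every later F.relu call, including inside Net.forward, now uses my_relu

model = Net()
out = model(torch.randn(1, 3, 24, 24))

F.relu = _original_relu  # restore the original behavior afterwards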