I have defined a layer of my own (sketched below), and I want to replace some layers in the original Net with it.
For example, the Net is defined as:
```python
import torch.nn as nn
import torch.nn.functional as F

class Net(nn.Module):
    def __init__(self):
        super(Net, self).__init__()
        self.conv1 = nn.Conv2d(3, 6, 5)
        self.conv2 = nn.Conv2d(6, 16, 3)
        self.conv3 = nn.Conv2d(16, 32, 3)
        self.pool = nn.MaxPool2d(2, 2)
        self.fc1 = nn.Linear(32, 10)

    def forward(self, x):
        x = self.pool(F.relu(self.conv1(x)))
        x = self.pool(F.relu(self.conv2(x)))
        x = self.pool(F.relu(self.conv3(x)))
        x = x.view(-1, 32)  # assumes the spatial size has been reduced to 1x1
        x = self.fc1(x)
        return x
```
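For reference, the kind of custom layer I mean looks like this. This is only a minimal sketch; `MyReLU` and its body are placeholders for my actual layer:

```python
import torch
import torch.nn as nn

class MyReLU(nn.Module):
    """Hypothetical custom activation; the real layer I wrote is more involved."""
    def forward(self, x):
        # placeholder behavior: a plain clamp, standing in for my own operation
        return torch.clamp(x, min=0)
```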
In this Net, the activation is applied with F.relu inside forward() rather than with an nn.ReLU module, and it is this relu that I want to replace. If I print Net(), I get:
```
Net(
  (conv1): Conv2d(3, 6, kernel_size=(5, 5), stride=(1, 1))
  (conv2): Conv2d(6, 16, kernel_size=(3, 3), stride=(1, 1))
  (conv3): Conv2d(16, 32, kernel_size=(3, 3), stride=(1, 1))
  (pool): MaxPool2d(kernel_size=2, stride=2, padding=0, dilation=1, ceil_mode=False)
  (fc1): Linear(in_features=32, out_features=10, bias=True)
)
```
Note that no ReLU operation appears in this output, because F.relu is a function called inside forward() rather than a registered submodule.
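For comparison, if the activation were registered as a submodule, it would appear in the printout and could be swapped by plain attribute assignment. A minimal sketch, assuming a hypothetical variant `NetWithModules` that we were free to write:

```python
import torch.nn as nn

class NetWithModules(nn.Module):
    # hypothetical variant of Net that registers its activation as a submodule
    def __init__(self):
        super().__init__()
        self.conv1 = nn.Conv2d(3, 6, 5)
        self.relu = nn.ReLU()  # shows up in print(net) because it is registered

    def forward(self, x):
        return self.relu(self.conv1(x))

net = NetWithModules()
print(net)                   # (relu): ReLU() appears here
net.relu = nn.LeakyReLU()    # replacing a registered submodule is one assignment
```

But my Net does not register the activation this way, so that simple swap is not available.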
So my question is: if I want to modify the relu operation in Net, and Net itself cannot be rewritten, how can I do it?
Can torch.jit.script help with this?
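To make the question concrete, the only workaround I have thought of so far is monkey-patching the functional call globally, which seems fragile. A sketch of what I mean, not something I am sure is safe (`my_relu` is a hypothetical replacement):

```python
import torch.nn.functional as F

_original_relu = F.relu  # keep a handle so the patch can be undone

def my_relu(input, inplace=False):
    # hypothetical replacement matching F.relu's signature; note this patch
    # affects *every* caller of F.relu in the whole process, not just Net
    return input.clamp(min=0)

F.relu = my_relu
net = Net()
# ... forward() now reaches my_relu through the F.relu lookup ...
F.relu = _original_relu  # restore afterwards
```

Is there a cleaner, more targeted way to do this replacement than patching the module globally?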