Tanh fusion with conv for quantization

Hello,

I am trying to apply post-training quantization to an augmented ResNet model that uses a tanh activation on the extracted features. I know that model fusion currently supports conv+bn+relu combinations, but is there a way to fuse a tanh activation with its preceding layer?
Here is my model setup for the final layers of my augmented ResNet:

import torch
import torch.nn as nn

class Resnet(nn.Module):
    def __init__(self, block, layers, hidden_feats, norm_layer,
                 replace_stride_with_dilation):
        super().__init__()
        # [previous layers]
        self.layer4 = self._make_layer(block, 512, layers[3], stride=2,
                                       dilate=replace_stride_with_dilation[2])
        self.avgpool = nn.AdaptiveAvgPool2d((1, 1))

        # added conv layer on the extracted features
        self.features_conv = nn.Sequential(
            nn.Conv2d(self.last_channel, hidden_feats, kernel_size=1),
            norm_layer(hidden_feats),
            nn.ReLU(inplace=True)
        )

        # attention branch that ends in tanh
        self.attention = nn.Sequential(
            nn.Conv2d(hidden_feats, hidden_feats, kernel_size=1),
            nn.Tanh()
        )

    def forward(self, x):
        # [previous layers]
        x = self.layer4(x)

        out = self.features_conv(x)
        x_att = self.attention(out)  # attention weights in [-1, 1]
        out = out * x_att            # gate the features

        out = self.avgpool(out)
        out = torch.flatten(out, 1)

        return out
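
For context, the supported conv+bn+relu triple in features_conv can be fused before quantization with something like this (a minimal eager-mode sketch; it assumes `model` is an instance of the Resnet class above):

import torch.quantization as tq

model.eval()  # fusion for post-training quantization expects eval mode
# Fuse the Conv2d + BatchNorm2d + ReLU triple in features_conv
# (Sequential indices 0, 1 and 2). There is no fusion rule for the
# Conv2d + Tanh pair in `attention`, so it is left unfused.
model_fused = tq.fuse_modules(
    model, [['features_conv.0', 'features_conv.1', 'features_conv.2']]
)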

At the moment tanh fusion is not supported. @raghuramank100, any plans for this?

We currently don’t have plans to support this. Please file a feature request on GitHub, or feel free to submit a PR implementing it and we can review it.
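
As a stopgap, one option is to keep the pair grouped in a single wrapper module so the rest of your fusion workflow is unaffected (a hypothetical sketch; ConvTanh is my own name, not an existing PyTorch API, and this is a grouping rather than a kernel-level fusion):

import torch
import torch.nn as nn

class ConvTanh(nn.Module):
    # Groups Conv2d + Tanh into one module. This does not fuse the two
    # ops into a single kernel; it only keeps them together so a custom
    # qconfig or observer can be attached to the pair as a unit.
    def __init__(self, in_channels, out_channels, kernel_size=1):
        super().__init__()
        self.conv = nn.Conv2d(in_channels, out_channels,
                              kernel_size=kernel_size)
        self.act = nn.Tanh()

    def forward(self, x):
        return self.act(self.conv(x))

Thanks!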