Why do we use __constants__ (or Final)?

Hi All,

I’m looking at the TorchVision implementation of GoogLeNet and I see that __constants__ is used in the class definition of the Inception block. I read the documentation but I still don’t understand how it works and what happens if I remove it. As far as I understand, it is something used by TorchScript, but I don’t have the full picture yet (probably because I need to learn more about TorchScript).

I also tried:

import torch  # the Inception class used here is torchvision's, copied below

inception_block = Inception(192, 64, 96, 128, 16, 32, 32)
inception_block = torch.jit.script(inception_block)
inception_block

And I don’t get any error even if I remove __constants__ = ['branch2', 'branch3', 'branch4'] from the class definition.

import torch
import torch.nn as nn
from torchvision.models.googlenet import BasicConv2d  # Conv2d + BatchNorm2d + ReLU helper


class Inception(nn.Module):
    __constants__ = ['branch2', 'branch3', 'branch4']

    def __init__(self, in_channels, ch1x1, ch3x3red, ch3x3, ch5x5red, ch5x5, pool_proj,
                 conv_block=None):
        super(Inception, self).__init__()
        if conv_block is None:
            conv_block = BasicConv2d
        self.branch1 = conv_block(in_channels, ch1x1, kernel_size=1)

        self.branch2 = nn.Sequential(
            conv_block(in_channels, ch3x3red, kernel_size=1),
            conv_block(ch3x3red, ch3x3, kernel_size=3, padding=1)
        )

        self.branch3 = nn.Sequential(
            conv_block(in_channels, ch5x5red, kernel_size=1),
            conv_block(ch5x5red, ch5x5, kernel_size=3, padding=1)
        )

        self.branch4 = nn.Sequential(
            nn.MaxPool2d(kernel_size=3, stride=1, padding=1, ceil_mode=True),
            conv_block(in_channels, pool_proj, kernel_size=1)
        )

    def _forward(self, x):
        # Run the four parallel branches on the same input
        branch1 = self.branch1(x)
        branch2 = self.branch2(x)
        branch3 = self.branch3(x)
        branch4 = self.branch4(x)

        outputs = [branch1, branch2, branch3, branch4]
        return outputs

    def forward(self, x):
        outputs = self._forward(x)
        # Concatenate the branch outputs along the channel dimension
        return torch.cat(outputs, 1)

Can you explain in more detail what the utility of adding constants is, and what happens if I don’t do it in a case like this one?

Thanks,
Mario

The relevant docs can be found in the “How do I store attributes on a ScriptModule” question here. The idea is that if the jit knows certain values are constant and unchanging, then it can apply more aggressive optimizations and re-ordering to those values than it can to regular attributes on a model.

So when you remove __constants__ or a Final type annotation, the model’s behavior shouldn’t change, but less information is available to the jit about what it can do with your code. The jit’s type system will also enforce that these values are not mutated, which can make your code cleaner as well.
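For example, here is a minimal sketch of marking a plain Python value as a constant with Final (the Scaler module and its scale attribute are made up for illustration, not taken from torchvision):

import torch
import torch.nn as nn
from typing import Final  # torch.jit.Final also works on older releases

class Scaler(nn.Module):
    # Annotating 'scale' as Final tells the jit it will never change;
    # it carries the same information as __constants__ = ['scale'].
    scale: Final[float]

    def __init__(self, scale: float):
        super().__init__()
        self.scale = scale

    def forward(self, x):
        return x * self.scale

m = torch.jit.script(Scaler(2.0))
print(m(torch.ones(3)))  # tensor([2., 2., 2.])
# m.scale = 4.0  # the jit should reject this: constants cannot be reassigned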


Thanks Driazati!

I’m still a bit confused by the fact that here we are marking as __constants__ blocks that have weights that will change during back-propagation. Again, I’m not an expert on the jit, so it’s possible that I’m missing something that has nothing to do with __constants__ at all.

With __constants__, are we just saying that the shape and type will not change, but we are OK with the parameter values changing?

I understand something like __constants__ = ['a'] followed by a self.a = 4, because I can see that 4 is a constant value that can be hard-coded, but I’m still confused about how blocks with weights are treated.

Thanks again,
Mario

Ah yeah, the case you’re seeing here is a hack that we’ve been using for a while: some modules can have optional submodules (i.e. the submodule can either be present or None). When we script something, we happen to add submodules first, then constants, skipping any names that are already present on the module, so each name either gets added as a normal submodule (ignoring its entry in __constants__) or as a None constant. If the compiler sees a None constant in an if-statement, it will skip compilation of the code inside the if, which allows us to support uses like downsample in resnet.

So if you see an nn.Module in __constants__, all it really means is Optional[nn.Module]; we just have this kind-of-nonsense way of specifying it.
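Roughly, the pattern looks like this (a simplified sketch with made-up names, not the actual resnet code; newer versions may just warn that the __constants__ entry is ignored when the submodule is actually present):

import torch
import torch.nn as nn
from typing import Optional

class Block(nn.Module):
    # Listing an nn.Module attribute here is the old way of saying
    # "this submodule may be None"; today you could simply annotate it
    # as Optional[nn.Module] instead.
    __constants__ = ['downsample']

    def __init__(self, channels: int, downsample: Optional[nn.Module] = None):
        super().__init__()
        self.conv = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.downsample = downsample  # a real submodule or None

    def forward(self, x):
        out = self.conv(x)
        if self.downsample is not None:
            # When downsample is a None constant, the compiler skips this
            # branch entirely instead of trying to compile a call on None.
            x = self.downsample(x)
        return out + x

# Scripting works whether or not the optional submodule is present.
with_ds = torch.jit.script(Block(8, downsample=nn.Conv2d(8, 8, kernel_size=1)))
without_ds = torch.jit.script(Block(8))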


Haha, great, now I get it and it makes more sense ;)! Thanks for the answer!
