Batch norm layer reused in Sequential stored twice in state_dict

Hi.

When defining a network like this:

import torch

class test(torch.nn.Module):
    def __init__(self):
        super(test, self).__init__()
        # Register the batch norm layer as a direct attribute
        # (num_features must match the conv's out_channels, i.e. 128)...
        self.bn1 = torch.nn.BatchNorm2d(num_features=128, eps=1e-5)
        # ...and reuse the same module instance inside a Sequential.
        self.conv = torch.nn.Sequential(
            torch.nn.Conv2d(in_channels=3, out_channels=128, kernel_size=3),
            self.bn1,
        )

if __name__ == "__main__":
    t = test()
    # The batch norm tensors appear under both prefixes: bn1.* and conv.1.*
    for key in t.state_dict().keys():
        print(key)
The batch norm parameters appear twice in the state_dict. As a user, I expected bn1 to appear only once, but it shows up under two keys: once with the bn1 prefix and once with the conv.1 prefix (from the Sequential).
Is this desired behavior? I would say no, since the underlying object is the same, but I could be wrong.
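For what it's worth, a minimal check (assuming the test class above) suggests the duplication is only in the key space, not in memory, since both entries point at the same underlying storage:

t = test()
sd = t.state_dict()

# Both keys exist for the same batch norm weight...
print("bn1.weight" in sd, "conv.1.weight" in sd)  # True True

# ...but they share the same underlying storage, so the parameter
# itself is not duplicated, only the state_dict key.
print(sd["bn1.weight"].data_ptr() == sd["conv.1.weight"].data_ptr())  # True

Note that in recent PyTorch versions named_parameters() deduplicates shared parameters by default (remove_duplicate=True), while state_dict() does not, which is where the two sets of keys come from.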