L2 Normalization Layer

Hi, I am trying to implement an L2 normalization layer. But I see that F.normalize is not accepted by nn.Sequential, since it is a function that requires an input tensor rather than a module. Is there an L2 normalization layer in PyTorch?

self.backbone = torch.nn.Sequential(self.conv_b1,
                                    self.conv_b2,
                                    self.conv_b3,
                                    self.conv_b4_1,
                                    self.conv_b4,
                                    self.conv_b5,
                                    F.normalize())

TypeError: normalize() missing 1 required positional argument: 'input'
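For reference, this is a minimal sketch of the kind of wrapper I was considering as a workaround, assuming the right approach is to wrap F.normalize in a custom nn.Module so it can be placed inside nn.Sequential (the class name L2Norm is just my own placeholder):

import torch
import torch.nn as nn
import torch.nn.functional as F

class L2Norm(nn.Module):
    """Wraps F.normalize so L2 normalization can be used as a layer."""
    def __init__(self, dim=1, eps=1e-12):
        super().__init__()
        self.dim = dim
        self.eps = eps

    def forward(self, x):
        # Normalize each sample to unit L2 norm along the given dimension
        return F.normalize(x, p=2, dim=self.dim, eps=self.eps)

# Example usage in a Sequential (placeholder conv layers for illustration):
backbone = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1),
    nn.AdaptiveAvgPool2d(1),
    nn.Flatten(),
    L2Norm(dim=1),
)

x = torch.randn(4, 3, 32, 32)
out = backbone(x)
print(out.norm(dim=1))  # should be ~1.0 for each sample

Is something like this the recommended way, or does PyTorch already ship such a layer?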