Why is the grad of my parameter None?

I want to use `feature_conv * self.a` to select values in `feature_conv`, but the values in the parameter `a` always stay at one.

I found that the grad of `a` is `None`. How can I fix it?

    def _sel_CAM(self, feature_conv):

        # self.a = nn.Parameter(torch.ones((1, 2048, 1, 1), dtype=torch.float32), requires_grad=True)
        # feature_conv is a torch tensor of shape (B, 2048, H, W)

        B, C, H, W = feature_conv.size()
        S = H * W

        # weighted channel average, shape (B, H, W)
        cam = torch.sum(feature_conv * self.a, dim=1) / 2048
        cam = cam.view(B, -1)
        # indices from an ascending sort along the flattened spatial dimension
        rank = torch.sort(cam, dim=1)[1].view(B, H, W)
        # keep only the 16 top-ranked spatial positions
        mask = rank >= (S - 16)
        mask = mask.view(B, H, W)
        feature_conv = feature_conv * mask.unsqueeze(1)
        return feature_conv

Where are you backpropagating? Please post a minimal, executable snippet that reproduces the error.
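In the meantime, here is a minimal sketch (with made-up small shapes, not your actual model) of one way `a` can end up with a `None` grad in code like yours: the indices returned by `torch.sort` and the `>=` comparison are both non-differentiable, so if the loss only depends on the masked output, the autograd graph back to `a` is cut.

```python
import torch
import torch.nn as nn

# toy stand-ins for self.a and feature_conv (shapes are illustrative only)
a = nn.Parameter(torch.ones(1, 4, 1, 1))
feature_conv = torch.randn(2, 4, 3, 3, requires_grad=True)

B, C, H, W = feature_conv.size()
S = H * W

cam = torch.sum(feature_conv * a, dim=1) / C
cam = cam.view(B, -1)
# [1] takes the sort *indices*: an integer tensor with no grad history
rank = torch.sort(cam, dim=1)[1].view(B, H, W)
# boolean comparison: also detached from the autograd graph
mask = rank >= (S - 4)

out = feature_conv * mask.unsqueeze(1)
out.sum().backward()

print(feature_conv.grad is not None)  # gradient flows through the multiply
print(a.grad)                         # None: a only influences the mask
```

If this matches your situation, you would need a differentiable selection (e.g. multiplying by `cam` itself or a soft mask) rather than a hard top-k mask for `a` to receive gradients, but a runnable snippet of your training loop would confirm it.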