If I apply a mask tensor to a conv2d layer's weight as in the following code, will it change the original conv's parameters?

    def forward(self, input):
        if self.count % self.k == 0:
            # recompute the mask every k-th forward call
            self.mask, rnn_x = self.com_mask(torch.randn())
            return F.conv2d(input, self.weight*self.mask, self.bias, self.stride,
                            self.padding, self.dilation, self.groups)
        else:
            return F.conv2d(input, self.weight, self.bias, self.stride,
                            self.padding, self.dilation, self.groups)

Will the code change the conv's original parameters (the weight/bias tensors)? For example, if the mask tensor is all zeros, will the gradient from the next backpropagation be added to a zeroed weight?