How to update weights using part of grad

If I have the output, and my loss = MSELoss(output, target) + TVLoss(output), but in the MSELoss term I want to zero out part of the gradient of output. If I use register_hook, TVLoss will be affected as well, so what should I do?

Mask your MSE before adding it to the TV term.

If you want to mask out a part of the feature layer, mask both inputs to the loss (sketch below):
MSELoss(mask * output, mask * target)
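
For example, a minimal sketch of this first approach, assuming mask is a 0/1 tensor broadcastable to output and TVLoss is your own module:

import torch.nn as nn

mse_fn = nn.MSELoss()
tv_fn = TVLoss()  # your existing TV loss

# the masked-out positions contribute nothing to the MSE term,
# so they receive no gradient from it; the TV term is untouched
loss = mse_fn(mask * output, mask * target) + tv_fn(output)
loss.backward()
opt.step()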
If you want to mask the gradients in particular, you have to run MSELoss(...).backward() and TVLoss(...).backward() separately. You will need a hook that keeps a flag specifying whether it should modify the gradient or not:

class BackHook:
  def __init__(self, mask):
    self.do_mask = False
    self.mask = mask

  def hook(self, grad):
    # a Tensor hook receives the gradient and may return a modified
    # version; returning the input unchanged is a no-op
    if self.do_mask:
      return self.mask * grad
    return grad

hook = BackHook(mask)

# register on the tensor whose gradient you want to mask; note that
# Tensor.register_hook is the right API here (register_backward_hook
# is a Module method with a different signature)
output.register_hook(hook.hook)

Then, around the two backward passes:

import torch.nn as nn

mse = nn.MSELoss()(output, target)
tv = TVLoss()(output)

opt.zero_grad()  # zero once, before both backward passes
hook.do_mask = True
# if you want, you can also change hook.mask here
mse.backward(retain_graph=True)  # retain the graph for the second backward
hook.do_mask = False
# don't call zero_grad here: let the TV gradients add on top of the MSE ones
tv.backward()
opt.step()
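
Putting it all together, here is a self-contained sketch; the tiny model, the mask, and the tv_loss helper are hypothetical stand-ins just to make it runnable end to end:

import torch
import torch.nn as nn

def tv_loss(x):
    # hypothetical helper: simple anisotropic total variation
    # over an (N, C, H, W) tensor
    dh = (x[:, :, 1:, :] - x[:, :, :-1, :]).abs().mean()
    dw = (x[:, :, :, 1:] - x[:, :, :, :-1]).abs().mean()
    return dh + dw

class BackHook:
    def __init__(self, mask):
        self.do_mask = False
        self.mask = mask

    def hook(self, grad):
        return self.mask * grad if self.do_mask else grad

model = nn.Conv2d(3, 3, 3, padding=1)  # stand-in model
opt = torch.optim.SGD(model.parameters(), lr=0.01)

x = torch.randn(1, 3, 8, 8)
target = torch.randn(1, 3, 8, 8)
mask = (torch.rand(1, 3, 8, 8) > 0.5).float()  # 0/1 mask

output = model(x)
hook = BackHook(mask)
output.register_hook(hook.hook)  # fires on every backward through output

mse = nn.MSELoss()(output, target)
tv = tv_loss(output)

opt.zero_grad()
hook.do_mask = True
mse.backward(retain_graph=True)  # masked gradient flows through output
hook.do_mask = False
tv.backward()                    # full, unmasked TV gradient accumulates
opt.step()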