Hey, everyone!
Is there any way to accumulate gradients with abs() applied? I want to do some analysis on the absolute values of the gradients of individual outputs.
The following is an example:
```python
a1_grad = torch.autograd.grad(pred[0], input, create_graph=False, retain_graph=True)[0]
a2_grad = torch.autograd.grad(pred[10], input, create_graph=False, retain_graph=True)[0]
a12_grad = torch.autograd.grad(pred[0] + pred[10], input, create_graph=False, retain_graph=True)[0]
```
My goal is for a12_grad to equal a1_grad.abs() + a2_grad.abs(). The third call above doesn't achieve this: differentiating the sum of outputs gives a1_grad + a2_grad, which differs from the abs-sum whenever the two gradients have opposite signs, since abs() is not linear.
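One straightforward way to get the abs-accumulated gradient is to compute each output's gradient separately, take the absolute value, and sum. Here is a minimal, self-contained sketch; the linear model (`W @ input`) and the tensor shapes are my own assumptions for illustration, not part of the original question:

```python
import torch

# Hypothetical setup: a simple linear map so pred[0] and pred[10] exist.
torch.manual_seed(0)
input = torch.randn(5, requires_grad=True)
W = torch.randn(20, 5)
pred = W @ input  # gradient of pred[i] w.r.t. input is W[i]

# Accumulate abs() of each output's gradient, one autograd.grad call per output.
acc = torch.zeros_like(input)
for i in (0, 10):
    g = torch.autograd.grad(pred[i], input, retain_graph=True)[0]
    acc += g.abs()

# For comparison: differentiating the sum of outputs yields the plain
# (signed) sum of gradients, which is generally different from `acc`.
g_sum = torch.autograd.grad(pred[0] + pred[10], input)[0]
```

The per-output loop costs one backward pass per output; if many outputs are involved, computing the full Jacobian (e.g. with `torch.autograd.functional.jacobian`) and then taking `jac.abs().sum(0)` may be a more convenient alternative.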
Thanks in advance!