Hi,
I am trying to add a term to the loss function. The term has requires_grad = False.
Here is the structure:

pred, x = net(input)
loss = nn.MSELoss()(pred, target) + custom_function(x)

The term custom_function(x) has requires_grad=False.
I know that during the backward pass this term does not contribute to the gradient, but it does affect the value of loss.item(). Would this term play any role in decreasing the loss?

No, adding a constant to a function will not change its derivative and thus the gradients:

import torch
import torch.nn.functional as F

x = torch.randn(1)
w = torch.randn(1, requires_grad=True)
target = torch.ones(1)
out = x * w
loss = F.mse_loss(out, target)
print(loss)
# tensor(1.0523, grad_fn=<MseLossBackward0>)
loss.backward()
print(w.grad)
# tensor([-0.3013])
w.grad = None
out = x * w
loss = F.mse_loss(out, target)
loss = loss + 1000.
print(loss)
# tensor(1001.0523, grad_fn=<AddBackward0>)
loss.backward()
print(w.grad)
# tensor([-0.3013])
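The same holds when the added term is a detached tensor rather than a Python number, which matches your setup. Below is a minimal sketch where a stand-in for your custom_function(x) (here just a detached (x ** 2).sum(), an assumption for illustration) is added to the loss: the loss value changes, but the gradient of w is bitwise identical.

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)
x = torch.randn(1)
w = torch.randn(1, requires_grad=True)
target = torch.ones(1)

# Baseline: gradient of the plain MSE loss
out = x * w
loss = F.mse_loss(out, target)
loss.backward()
grad_plain = w.grad.clone()

# Same forward pass, plus a detached extra term
# (stand-in for custom_function(x) with requires_grad=False)
w.grad = None
out = x * w
extra = (x ** 2).sum().detach()
loss = F.mse_loss(out, target) + extra
loss.backward()

# The loss value differs by `extra`, but the gradient is unchanged
print(torch.equal(grad_plain, w.grad))  # True
```

So the extra term only shifts the reported loss value; the optimizer steps are driven by the gradients, which it never touches.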