Intentional constant loss

Is there a way to fix the training loss to a certain value?

Do you mean at the beginning of training or during training?
Could you explain your use case, as I'm not sure what you're trying to do exactly?

Umm, something like this during training:

...
loss = 6000
loss.backward()
...

That’s not really possible: in your code the loss is completely detached from any other operation, and it isn’t even a torch.Tensor.
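
To illustrate (a minimal sketch, with illustrative variable names): even if you wrap the constant in a tensor, it carries no computation graph connecting it to your parameters, so calling backward() on it produces no parameter gradients:

import torch

w = torch.randn(1, 1, requires_grad=True)
loss = torch.tensor(6000., requires_grad=True)  # constant, not computed from w
loss.backward()
print(w.grad)  # None, since the constant loss is detached from w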

I’m not sure if that’s what you are looking for, but you could scale your loss to a specific value:

import torch

x = torch.randn(1, 1)
w = torch.randn(1, 1, requires_grad=True)
output = x * w
target = torch.randn(1, 1)
loss = (output - target)**2
# rescale the loss so that its value equals 6000
factor = 6000. / loss.item()
loss = loss * factor
loss.backward()
print(w.grad)
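
Keep in mind that multiplying the loss by this factor also scales the resulting gradients by the same factor, and since the factor depends on the current loss value, the effective scaling will change every iteration.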

Yes, that’s what I was looking for. I didn’t know how to attach a specific value to the loss.