Including model partial derivatives in the loss function

I want to enforce a monotonic relationship between a set of features and the model's output, and I would like to do it by incorporating the model's partial derivatives into the loss function, like a regularization term.

I tried the solution proposed here, but I got an error:

one of the variables needed for gradient computation has been modified by an inplace operation: [torch.FloatTensor [2, 1]], which is output 0 of TBackward, is at version 2; expected version 1 instead. Hint: enable anomaly detection to find the operation that failed to compute its gradient, with torch.autograd.set_detect_anomaly(True).
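For reference, the anomaly detection that the hint mentions is enabled with a single call before the forward pass; it makes the eventual error point at the in-place operation that broke the backward pass:

```python
import torch

# Produces a second traceback locating the forward-pass operation
# whose output was later modified in place. Slows things down, so
# it is meant for debugging only.
torch.autograd.set_detect_anomaly(True)
```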

Is there a way to include the model's partial derivatives in the loss?
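For concreteness, here is a minimal sketch of the kind of penalty I have in mind, built with torch.autograd.grad and create_graph=True so the penalty term itself stays differentiable. The model, lam, mono_idx, and loss_with_monotonicity names are illustrative placeholders, not my actual code:

```python
import torch
import torch.nn as nn

# Toy model and optimizer (placeholders).
model = nn.Sequential(nn.Linear(2, 16), nn.Tanh(), nn.Linear(16, 1))
mse = nn.MSELoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

lam = 0.1        # weight of the monotonicity penalty (assumed)
mono_idx = [0]   # feature indices that should act monotonically (assumed)

def loss_with_monotonicity(x, y):
    x = x.requires_grad_(True)   # track gradients w.r.t. the inputs
    pred = model(x)
    # d(pred)/d(x): summing over samples gives per-sample gradient rows,
    # since the model acts on each row independently. create_graph=True
    # keeps the graph so the penalty can be backpropagated through.
    grads = torch.autograd.grad(pred.sum(), x, create_graph=True)[0]
    # Penalize negative partial derivatives for the selected features,
    # pushing toward a monotonically increasing relationship.
    penalty = torch.relu(-grads[:, mono_idx]).mean()
    return mse(pred, y) + lam * penalty

x = torch.randn(32, 2)
y = torch.randn(32, 1)
for _ in range(10):
    optimizer.zero_grad()
    loss = loss_with_monotonicity(x, y)
    loss.backward()
    optimizer.step()
```

The create_graph=True flag is what lets loss.backward() differentiate through the penalty term; without it the partial derivatives would be treated as constants and the penalty would not influence the parameter updates.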

Hi,

This is the same error as the one from the post you linked. Does the answer there help you?

Hi Alban,

I tried the solution proposed there (simply copy-pasting it), but it is not working.