I have two predictions, a and b, which are regression outputs of my network. How can I set constraints, or define a loss, that enforces the condition a > b? In other words, I want to teach the network that a should always be greater than b.
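For illustration, one common way to express such a pairwise constraint is a hinge-style ranking penalty, e.g. via `nn.MarginRankingLoss` with target y = 1 (a minimal sketch; the tensors `a` and `b` stand in for your model's two regression outputs):

```python
import torch
import torch.nn as nn

# Stand-ins for the two regression outputs of the network
a = torch.tensor([2.0, 0.5, 3.0])
b = torch.tensor([1.0, 1.5, 2.0])

# With target y = 1, MarginRankingLoss computes
#   loss = max(0, -y * (a - b) + margin)
# so it is zero when a > b + margin and grows linearly otherwise.
criterion = nn.MarginRankingLoss(margin=0.1)
target = torch.ones_like(a)  # y = 1 means "a should rank above b"
loss = criterion(a, b, target)
```

Only the middle pair (0.5 vs. 1.5) violates a > b here, so only it contributes to the mean loss; adding this term to your usual regression loss pushes the network toward a > b.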
Thanks for your reply! Do you know how I can find the source code for it? I got this far in the PyTorch source:
```python
def margin_ranking_loss(input1, input2, target, margin=0, size_average=None,
                        reduce=None, reduction='mean'):
    # type: (Tensor, Tensor, Tensor, float, Optional[bool], Optional[bool], str) -> Tensor
    r"""margin_ranking_loss(input1, input2, target, margin=0, size_average=None, reduce=None, reduction='mean') -> Tensor

    See :class:`~torch.nn.MarginRankingLoss` for details.
    """  # noqa
    if not torch.jit.is_scripting():
        tens_ops = (input1, input2, target)
        if any([type(t) is not Tensor for t in tens_ops]) and has_torch_function(tens_ops):
            return handle_torch_function(
                margin_ranking_loss, tens_ops, input1, input2, target, margin=margin,
                size_average=size_average, reduce=reduce, reduction=reduction)
    if size_average is not None or reduce is not None:
        reduction_enum = _Reduction.legacy_get_enum(size_average, reduce)
    else:
        reduction_enum = _Reduction.get_enum(reduction)
    if input1.dim() == 0 or input2.dim() == 0 or target.dim() == 0:
        raise RuntimeError(("margin_ranking_loss does not support scalars, got sizes: "
                            "input1: {}, input2: {}, target: {} ".format(input1.size(), input2.size(), target.size())))
    return torch.margin_ranking_loss(input1, input2, target, margin, reduction_enum)
```
but I cannot find the source code to see how they actually defined it. Any suggestions?
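Note that this Python wrapper only dispatches to `torch.margin_ranking_loss`, whose native implementation lives in PyTorch's C++ backend (ATen), which is why it doesn't appear in the Python sources. The documented formula is loss(x1, x2, y) = max(0, -y * (x1 - x2) + margin). A quick sanity check of that formula against the built-in (just a sketch):

```python
import torch
import torch.nn.functional as F

def margin_ranking_manual(x1, x2, y, margin=0.0):
    # The documented definition: max(0, -y * (x1 - x2) + margin),
    # averaged over elements (reduction='mean')
    return torch.clamp(-y * (x1 - x2) + margin, min=0).mean()

x1 = torch.tensor([0.3, -1.2, 2.0])
x2 = torch.tensor([0.5, -2.0, 2.0])
y = torch.ones(3)

manual = margin_ranking_manual(x1, x2, y, margin=0.5)
builtin = F.margin_ranking_loss(x1, x2, y, margin=0.5)
```

The two values agree, so for understanding the loss you can treat the one-line formula as the definition.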