According to my understanding, we are searching for the weights that minimize the loss function, i.e.:
optim.zero_grad() # clear gradients accumulated from the previous step
loss = Loss(...) # compute the loss value according to the loss metric
loss.backward() # compute the gradients via backpropagation
optim.step() # gradient descent: take one step toward the minimum
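The steps above can be sketched as a minimal runnable loop. This is just an illustration of the standard minimization pattern; the model, data, and loss here are toy placeholders, not from any real training setup:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# toy model and data, purely for illustration
model = nn.Linear(10, 2)
optim = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.MSELoss()
x = torch.randn(4, 10)
target = torch.randn(4, 2)

losses = []
for _ in range(5):
    optim.zero_grad()                  # clear old gradients
    loss = loss_fn(model(x), target)   # compute the loss value
    loss.backward()                    # compute gradients
    optim.step()                       # one gradient-descent step downhill
    losses.append(loss.item())
```

With a small enough learning rate, the recorded losses decrease from step to step, which is exactly the "go one step toward the minimum" behavior described above.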
When dealing with TripletMarginLoss (nn.TripletMarginLoss), we are searching for the weights that maximize this loss function.
So, how can we configure the optimizer (optim.Adam or optim.SGD) to find the maximum of the loss function instead of the minimum?
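One common way to turn a maximization into a minimization (independent of which optimizer is used) is to minimize the negated objective, since PyTorch optimizers always step downhill. A minimal sketch with a hypothetical scalar objective, not tied to any particular model:

```python
import torch

# Maximize f(w) = -(w - 3)^2, whose maximum is at w = 3,
# by minimizing its negation with a standard optimizer.
w = torch.tensor([0.0], requires_grad=True)
optim = torch.optim.SGD([w], lr=0.1)

for _ in range(100):
    optim.zero_grad()
    objective = -(w - 3.0) ** 2   # the quantity we want to MAXIMIZE
    loss = -objective             # so we MINIMIZE its negative
    loss.backward()
    optim.step()

# w converges toward 3.0, the maximizer of the objective
```

Newer PyTorch versions also expose a `maximize=True` argument on optimizers such as `torch.optim.SGD` and `torch.optim.Adam`, which flips the step direction internally and achieves the same effect without negating the loss by hand.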