Geometric-Aware loss function

I am trying to implement the geometric-aware loss function outlined in Render for CNN (https://arxiv.org/pdf/1505.05641.pdf), whose Caffe implementation can be found here (https://github.com/charlesq34/caffe-render-for-cnn/blob/view_prediction/src/caffe/layers/softmax_view_loss_layer.cpp).

The problem is that since predictions are made over multiple labels at the same time, gradients should only be backpropagated for a specific subset of the outputs. The Caffe code does this by assigning a weight to each gradient, with many of the weights being 0 (lines 139 and 177-187 in the file linked above). Is there a way to implement a similar function in PyTorch? If so, any advice on how to do this? Thank you!

I had to implement my own loss function with custom gradients as well, normalizing the gradients of some regularization terms (in this case, the non-informative prior for geodesic distance plus the L1 loss on the in-plane rotation) according to the softmax-conditioned outputs.

You can specify such custom forward and backward procedures using PyTorch Functions or Modules.

http://pytorch.org/docs/master/notes/extending.html demonstrates how you can obtain the incoming gradient and manipulate it, in this case to replicate the backward procedure in the Caffe code.
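For instance, here is a minimal sketch of a custom `torch.autograd.Function` whose backward pass scales each view bin's gradient by a precomputed weight and zeroes the bins outside a band around the ground-truth label. The `exp(-distance)` weighting and the `bandwidth` parameter are illustrative assumptions standing in for the paper's exact geometric weights, not a faithful port of the Caffe layer:

```python
import torch
from torch.autograd import Function

class MaskedViewLoss(Function):
    """Weighted softmax view loss with a hand-written backward pass.
    Gradients flow only to bins within `bandwidth` of the true label."""

    @staticmethod
    def forward(ctx, scores, labels, bandwidth):
        # scores: (N, C) raw view scores; labels: (N,) ground-truth bins
        probs = torch.softmax(scores, dim=1)
        n, c = probs.shape
        bins = torch.arange(c, device=scores.device)
        # weight each bin by exp(-distance to the true bin), zero outside the band
        dist = (bins.unsqueeze(0) - labels.unsqueeze(1)).abs().float()
        weights = torch.exp(-dist) * (dist <= bandwidth).float()
        loss = -(weights * torch.log(probs.clamp_min(1e-12))).sum() / n
        ctx.save_for_backward(probs, weights)
        return loss

    @staticmethod
    def backward(ctx, grad_output):
        probs, weights = ctx.saved_tensors
        n = probs.shape[0]
        # d/ds_j of -sum_k w_k log p_k is  p_j * sum_k w_k - w_j,
        # which is exactly zero wherever every weight vanishes
        grad = (probs * weights.sum(dim=1, keepdim=True) - weights) / n
        # one gradient per forward input; labels and bandwidth get None
        return grad_output * grad, None, None

# usage sketch
scores = torch.randn(8, 360, requires_grad=True)
labels = torch.randint(0, 360, (8,))
MaskedViewLoss.apply(scores, labels, 5).backward()
```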

If your loss only needs differentiation through standard ops, then you can just create an nn.Module and let autograd handle the backward pass for you :).
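As a concrete sketch, the same loss can be written as an nn.Module using only differentiable ops, so autograd derives the masked backward pass on its own; again, the `exp(-distance)` weighting within a `bandwidth` is an assumption about the geometric weighting, not the paper's exact formula:

```python
import torch
import torch.nn as nn

class SoftViewLoss(nn.Module):
    """Autograd-friendly version: the banded exp(-distance) weights
    multiply log-softmax terms, so no custom backward is needed."""

    def __init__(self, num_bins=360, bandwidth=5):
        super().__init__()
        self.register_buffer("bins", torch.arange(num_bins))
        self.bandwidth = bandwidth

    def forward(self, scores, labels):
        log_probs = torch.log_softmax(scores, dim=1)
        dist = (self.bins.unsqueeze(0) - labels.unsqueeze(1)).abs().float()
        weights = torch.exp(-dist) * (dist <= self.bandwidth).float()
        return -(weights * log_probs).sum(dim=1).mean()

# usage: autograd produces the masked gradients automatically
loss_fn = SoftViewLoss(num_bins=360, bandwidth=5)
scores = torch.randn(8, 360, requires_grad=True)
labels = torch.randint(0, 360, (8,))
loss_fn(scores, labels).backward()
```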

An example is available in my code here, for a structured Mahalanobis metric loss.
