I am using the Caffe2 C++ API. I know how to add an Adam operator, and my nets usually work.
One thing I have noticed is that I always need to add an operator that seeds the gradient of my final loss blob with a value of one.
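For reference, the seeding operator I mean looks roughly like this in NetDef prototext form (the blob names `loss` and `loss_grad` are just examples from my setup, not anything standard):

```proto
# Fill the gradient of the final loss with 1.0 so backprop can start from it
op {
  type: "ConstantFill"
  input: "loss"
  output: "loss_grad"
  arg { name: "value" f: 1.0 }
}
```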
Now I am trying to implement a BEGAN (Boundary Equilibrium GAN), and the paper says:
“Unlike most generative adversarial network architectures, where we need to update G and D independently, the Boundary Equilibrium GAN has the nice property that we can define a global loss and train the network as a whole (though we still have to make sure to update parameters with respect to the relative loss functions)”
So I need to update the parameters with respect to two separate losses, but given that I have a single net, how do I do that with Caffe2?
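What I imagine (though I am not sure this is right) is seeding one gradient per loss and then running each gradient chain only over its own parameters; the blob names here are hypothetical:

```proto
# Seed the discriminator loss gradient; its gradient operators would
# then update only the discriminator's parameters
op {
  type: "ConstantFill"
  input: "d_loss"
  output: "d_loss_grad"
  arg { name: "value" f: 1.0 }
}
# Seed the generator loss gradient; its gradient operators would
# then update only the generator's parameters
op {
  type: "ConstantFill"
  input: "g_loss"
  output: "g_loss_grad"
  arg { name: "value" f: 1.0 }
}
```

Is this the correct approach, or is there a proper way to drive two backward passes through one net?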
Any help would be greatly (greatly) appreciated.