I am wondering about the following: say I have two neural networks, netA and netB, and I get outputs from each of them,
y_A = netA(X)
y_B = netB(X)
and I compute a loss from model A's output, such as
loss = Loss(y_A, y)
but then I want to use this loss to update my model netB.
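To make the setup concrete, here is a minimal numpy sketch of what I mean, using single linear layers as stand-in networks (the shapes, the MSE loss, and the names W_A/W_B are just placeholders I picked for illustration). As written, the loss is computed from netA's output only, so its gradient with respect to netB's parameters comes out as zero:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data and two "networks", each just a single linear layer.
X = rng.normal(size=(8, 3))
y = rng.normal(size=(8, 1))
W_A = rng.normal(size=(3, 1))  # parameters of netA
W_B = rng.normal(size=(3, 1))  # parameters of netB

y_A = X @ W_A  # y_A = netA(X)
y_B = X @ W_B  # y_B = netB(X)

# Loss computed from netA's output only.
loss = np.mean((y_A - y) ** 2)

# Gradient of this loss with respect to netB's parameters:
# loss has no dependence on W_B, so the gradient is identically zero.
grad_W_B = np.zeros_like(W_B)
```

In other words, for the loss to move netB at all, y_B (or W_B) would have to enter the loss expression somewhere; that is the part of the question I am trying to understand.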
I ask because I am coming at this from a purely computational perspective: in linear regression, for example, we can differentiate the mean squared error with respect to the parameters to get the gradient directly, and in linear-algebra terms the backpropagated weight gradient of a layer is a product between the layer's input and the error signal flowing back from its output.
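For reference, the linear-regression case I have in mind can be written out explicitly: the MSE gradient with respect to the weights is the input matrix transposed times the output-side residual. A small sketch, checked against a finite-difference estimate (data and shapes here are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(20, 4))  # inputs
w = rng.normal(size=(4,))     # regression weights
y = rng.normal(size=(20,))    # targets

def mse(w):
    return np.mean((X @ w - y) ** 2)

# Closed-form gradient of the MSE: dL/dw = (2/n) * X^T (Xw - y),
# i.e. the layer's input (X) multiplied with the output-side error.
grad = 2.0 / len(y) * X.T @ (X @ w - y)

# Finite-difference check of the first coordinate.
eps = 1e-6
e0 = np.zeros_like(w)
e0[0] = eps
fd0 = (mse(w + e0) - mse(w - e0)) / (2 * eps)
```

The same input-times-error structure is what backpropagation computes layer by layer in a deeper network.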