# Loss function depends on the output gradients

I would like to calculate the loss in the following form:

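The equation did not carry over from the original post; based on the description below, the intended loss is presumably a sum of two mean-squared-error terms, one on the output values and one on the second derivatives (a reconstruction, not the author's exact formula):

```latex
\mathcal{L}
  = \frac{1}{N_1}\sum_{i=1}^{N_1}\bigl(u_{bc}(x_1^i) - \hat{u}_{bc}^{\,i}\bigr)^2
  + \frac{1}{N_2}\sum_{j=1}^{N_2}\bigl(u''_{r}(x_2^j) - \hat{u}''^{\,j}_{r}\bigr)^2
```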
where u_bc and \hat{u}_bc are the predicted and exact values of the output at the samples x_1, and u''_r and \hat{u}''_r are the predicted and exact second derivatives of the output at the samples x_2. x_1 and x_2 are different sets of samples.
I am trying to implement it in the following way:

# forward pass to calculate the first loss component
u_bc_pred = self.forward(self.X_u)
loss_bcs = self.loss_fnc(u_bc_pred, self.Y_u)

# the second loss component involves the second derivative of the output
# with respect to the input, so self.X_r must have requires_grad=True
u_r_pred = self.forward(self.X_r)
u_x = torch.autograd.grad(u_r_pred, self.X_r,
                          grad_outputs=torch.ones_like(u_r_pred),
                          create_graph=True)[0]
u_xx = torch.autograd.grad(u_x, self.X_r,
                           grad_outputs=torch.ones_like(u_x),
                           create_graph=True)[0]
loss_res = self.loss_fnc(u_xx, self.Y_r)

# Total loss
loss = loss_res + loss_bcs
self.loss_log.append(loss.item())  # .item() detaches and converts to a float

# backpropagation
loss.backward()
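The steps above can be put together as a self-contained sketch. The small MLP, the MSE loss, and the dummy data here are stand-ins for the class attributes (`self.forward`, `self.loss_fnc`, `self.X_u`, etc.), not the original model; the key points are that `X_r` is created with `requires_grad=True` and that both `torch.autograd.grad` calls pass `create_graph=True` so the final `backward()` can differentiate through the derivative computation:

```python
import torch

# Hypothetical stand-in for self.forward: a small MLP u(x)
model = torch.nn.Sequential(
    torch.nn.Linear(1, 16), torch.nn.Tanh(), torch.nn.Linear(16, 1)
)
loss_fnc = torch.nn.MSELoss()

# Boundary samples x_1 and residual samples x_2 (different sets)
X_u = torch.rand(8, 1)                        # x_1: no input gradients needed
X_r = torch.rand(32, 1, requires_grad=True)   # x_2: differentiated w.r.t.
Y_u = torch.zeros(8, 1)                       # exact boundary values (dummy)
Y_r = torch.zeros(32, 1)                      # exact second derivatives (dummy)

# First loss component: output values at x_1
u_bc_pred = model(X_u)
loss_bcs = loss_fnc(u_bc_pred, Y_u)

# Second loss component: d^2u/dx^2 at x_2 via two autograd.grad calls;
# create_graph=True keeps the graph so backward() can reach the weights
u_r_pred = model(X_r)
u_x = torch.autograd.grad(u_r_pred, X_r,
                          grad_outputs=torch.ones_like(u_r_pred),
                          create_graph=True)[0]
u_xx = torch.autograd.grad(u_x, X_r,
                           grad_outputs=torch.ones_like(u_x),
                           create_graph=True)[0]
loss_res = loss_fnc(u_xx, Y_r)

# Total loss and backpropagation
loss = loss_bcs + loss_res
loss.backward()
```

After `loss.backward()`, every network parameter carries a gradient that includes the contribution of the second-derivative term, which is exactly what an optimizer step needs.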