# Forward and derivative functions for error criterion

I have some functions that require storing error values at the current layer as gradients are passed through. For example, if the nonlinearity of a layer is a sigmoid, the line of code looks like:

```python
grad_output.detach() * (Value + sigmoidPrime(F.sigmoid(self.totalOut)))
```

and I have defined

```python
def sigmoidPrime(x):
    return x * (1 - x)
```
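(For context: `sigmoidPrime` here is the derivative of the sigmoid written in terms of the sigmoid's *output*, which is why it is applied to `F.sigmoid(self.totalOut)` rather than to the raw pre-activation. A minimal pure-Python check of that identity, using `math.exp` instead of torch just for illustration:)

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def sigmoidPrime(s):
    # derivative of sigmoid, expressed in terms of the sigmoid's OUTPUT s
    return s * (1 - s)

# numerical check: d/dx sigmoid(x) should match sigmoidPrime(sigmoid(x))
x = 0.7
eps = 1e-6
numeric = (sigmoid(x + eps) - sigmoid(x - eps)) / (2 * eps)
analytic = sigmoidPrime(sigmoid(x))
print(abs(numeric - analytic) < 1e-8)  # True
```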

I am looking to do the same thing for the top layer, which would require functions for the actual forward and backward steps of what happens inside the loss functions. Question 1: is there a generic way to do this, or a place within the PyTorch source where these functions are defined? Question 2: if not, can anyone help me write what these might look like for some common loss functions like MSE and CrossEntropy?

```python
def crossEntropyForward(x):
    ???

def crossEntropyPrime(x):
    ???
```

Hi,

I’m not sure what `sigmoidPrime` is supposed to represent here.
Also, I’m not sure what the `grad_output` in your formula above is. Can you share more context?

Ah yes, sorry: this code is inside a backward hook registered with `register_backward_hook`.

```python
layer.register_backward_hook(saveAverageD)
...
```

I’m still unsure what your goal here is. For example, MSE does this in the forward pass: `mse(x, y) = (x - y).pow(2).mean()`.
Then you can differentiate with respect to `x` and get `mse_backward_x(grad_out, x, y) = 2/N * grad_out.expand_as(x) * (x - y)`.
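To sketch what these forward/prime pairs might look like, here is a hedged pure-Python version (the names `mseForward`/`msePrime`/`crossEntropyForward`/`crossEntropyPrime` follow the `sigmoidPrime` naming convention from the question; this is not PyTorch's actual implementation, just the underlying math on plain lists, with a finite-difference check of the MSE gradient):

```python
import math

def mseForward(x, y):
    # forward: mean of squared differences
    n = len(x)
    return sum((xi - yi) ** 2 for xi, yi in zip(x, y)) / n

def msePrime(x, y):
    # dL/dx_i = 2/N * (x_i - y_i); multiply by grad_out inside a hook
    n = len(x)
    return [2.0 / n * (xi - yi) for xi, yi in zip(x, y)]

def crossEntropyForward(logits, target):
    # softmax + negative log-likelihood for a single example,
    # i.e. the math behind losses that take raw logits (like nn.CrossEntropyLoss)
    m = max(logits)  # subtract max for numerical stability
    log_sum = m + math.log(sum(math.exp(z - m) for z in logits))
    return log_sum - logits[target]

def crossEntropyPrime(logits, target):
    # dL/dz_i = softmax(z)_i - 1[i == target]
    m = max(logits)
    exps = [math.exp(z - m) for z in logits]
    s = sum(exps)
    return [e / s - (1.0 if i == target else 0.0) for i, e in enumerate(exps)]

# quick finite-difference check of msePrime
x, y = [0.2, -0.5, 1.0], [0.0, 0.0, 1.0]
eps = 1e-6
g = msePrime(x, y)
for i in range(len(x)):
    xp = list(x); xp[i] += eps
    xm = list(x); xm[i] -= eps
    numeric = (mseForward(xp, y) - mseForward(xm, y)) / (2 * eps)
    print(abs(numeric - g[i]) < 1e-6)  # True for each coordinate
```

Note that `crossEntropyPrime` already has the softmax baked in; its output sums to zero, and in a backward hook you would still multiply it by the incoming `grad_out`.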