Calling backward for a module without a forward?

Dear all,

For some reason, I need to compute the back-propagation through a module given a known value for the output gradient.

In other words, I want to call backward on a module without previously calling forward.

For instance, on a linear module with weights W and no bias such that y=Wx, this backward pass amounts to multiplication by W.t().
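For concreteness, here is a small sketch of what I mean (shapes and names are only illustrative):

```python
import torch

W = torch.randn(3, 4)       # weights of a bias-free linear layer, y = W x
grad_y = torch.randn(3)     # known gradient w.r.t. the output y

# the backward pass I would like to trigger, without any forward:
grad_x = W.t() @ grad_y     # gradient w.r.t. the input x
```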

Is it possible to do that kind of thing?

Hi,

It’s not possible in general because some modules actually require the forward pass to be able to do the backward.
In the case of the linear layer, if you want the gradients wrt the weights, then you need the forward pass to have saved the input tensor in order to compute them.

Of course, in some special cases like the one you showed it is possible, but I'm afraid you'll have to write the matrix multiplication (mm) yourself.
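Something along these lines should work as a sketch for the bias-free linear case (batched shapes and variable names are just an assumption for illustration):

```python
import torch
import torch.nn as nn

linear = nn.Linear(4, 3, bias=False)       # weight has shape (out=3, in=4)
grad_output = torch.randn(5, 3)            # known gradient w.r.t. the output, batch of 5

# Gradient w.r.t. the input: a single matmul with the weight, no forward needed.
grad_input = grad_output @ linear.weight   # shape (5, 4)

# The gradient w.r.t. the weight, however, needs the input x that the forward
# pass would normally have saved:
# grad_weight = grad_output.t() @ x
```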

Ok, thanks for the fast answer!