Manipulating gradients in backward

I’m trying to extend Function as you’ve suggested, but I can’t seem to make it work.
I seem to have two main problems:

  1. I cannot save the output variable using ctx.save_for_backward as I get

RuntimeError: save_for_backward can only save tensors, but argument 1 is of type Variable

This happens whether I save output or output.data.

  2. When I save only the input data, autograd never enters my backward function when I call backward (a minimal sketch of what I’m attempting is below).
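For context, this is roughly the pattern I’m trying to follow — a minimal sketch using the static-method Function API, with placeholder names like ClampGrad rather than my actual code:

```python
import torch

# Placeholder example: an identity-like op whose backward zeroes the
# gradient wherever the forward output was clamped to zero.
class ClampGrad(torch.autograd.Function):

    @staticmethod
    def forward(ctx, input):
        output = input.clamp(min=0)
        # save_for_backward expects tensors
        ctx.save_for_backward(output)
        return output

    @staticmethod
    def backward(ctx, grad_output):
        output, = ctx.saved_tensors
        grad_input = grad_output.clone()
        # manipulate the incoming gradient based on the saved output
        grad_input[output == 0] = 0
        return grad_input

x = torch.randn(5, requires_grad=True)
y = ClampGrad.apply(x)  # called through .apply so backward gets invoked
y.sum().backward()
print(x.grad)
```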

Any ideas where I’m going wrong here?