How to save a list of integers for backward when using a C++ custom layer?

For a custom layer like this:


import torch
from torch.autograd import Function


class CustomLayer(Function):
    @staticmethod
    def forward(ctx, input, int1, int2):
        # int1 and int2 are plain Python ints
        ctx.save_for_backward(input, int1, int2)
        out = some_c_function(input, int1, int2)
        return out

    @staticmethod
    def backward(ctx, grad_output):
        input, int1, int2 = ctx.saved_tensors
        grad_input = torch.zeros_like(input)
        backward_func(input, grad_output, grad_input, int1, int2)
        return grad_input, None, None


But when I call this layer, an error occurs:

TypeError: save_for_backward can only save variables, but argument 3 is of type int

Now the question is: how can I save those integers for backward?


Hi,

You should use save_for_backward only for the input and output tensors of the function.
All other tensors or Python objects can be stored directly on the ctx, e.g. ctx.in1 = int1.
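Applied to the snippet above, a minimal sketch could look like this (some_c_function and backward_func are the same placeholders from the question):

import torch
from torch.autograd import Function


class CustomLayer(Function):
    @staticmethod
    def forward(ctx, input, int1, int2):
        # only tensors go through save_for_backward
        ctx.save_for_backward(input)
        # plain Python objects are stashed as attributes on ctx
        ctx.int1 = int1
        ctx.int2 = int2
        out = some_c_function(input, int1, int2)
        return out

    @staticmethod
    def backward(ctx, grad_output):
        input, = ctx.saved_tensors
        int1, int2 = ctx.int1, ctx.int2
        grad_input = torch.zeros_like(input)
        backward_func(input, grad_output, grad_input, int1, int2)
        # one gradient per forward input; the ints get None
        return grad_input, None, None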


@albanD why do we need to use save_for_backward for input tensors only? I just tried to pass an input tensor from forward() to backward() using ctx.tensor = inputTensor in forward() and inputTensor = ctx.tensor in backward(), and it seemed to work.

I would appreciate your answer, since I'm currently trying to really understand when to use ctx rather than save_for_backward. Another post tells us that memory leaks aren't an issue any more, and touches on my question.

In addition, this other post suggests that in "earlier torch versions it was not possible to save_for_backward an intermediate result, only inputs to the forward function, but that seems to be no longer the case."

An actual limitation would be that, in the case of save_for_backward, the saved value needs to be a variable (in the sense of a PyTorch tensor).

Oh! PyTorch internals time!

So what happens is that save_for_backward stores the tensors in SavedVariables. This is what performs the sanity checks for the backward pass (the perhaps most (in)famous being "one of the variables needed for gradient computation has been modified by an inplace operation").
Variables that aren't inputs or outputs of your function (and don't share storage with them…) are not visible to the outside world, and thus cannot be changed in ways that would trigger those sanity checks.
So you can safely store such tensors directly on the ctx, but SavedVariable won't work for non-tensors.
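To make the sanity check concrete, here is a small sketch (the Square function is made up for illustration); modifying a saved input in place between forward and backward trips the version-counter check that SavedVariable performs:

import torch
from torch.autograd import Function


class Square(Function):
    @staticmethod
    def forward(ctx, x):
        ctx.save_for_backward(x)   # wrapped in a SavedVariable, version recorded here
        return x * x

    @staticmethod
    def backward(ctx, grad_output):
        x, = ctx.saved_tensors     # unpacking re-checks the version counter
        return 2 * x * grad_output


a = torch.randn(3, requires_grad=True)
b = a * 2                  # non-leaf tensor fed to the custom Function
y = Square.apply(b)
b.add_(1.0)                # in-place change bumps b's version counter
y.sum().backward()         # RuntimeError: one of the variables needed for gradient
                           # computation has been modified by an inplace operation

Storing b as ctx.x = b instead would skip this check: backward would still run, but silently with the modified values.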

Best regards

Thomas


I see, sanity checks. Thanks! :smile: