Autograd Functions: non-tensor arguments

questions

  1. Can I pass non-tensor arguments to autograd forward/backward?
  2. Can I save non-tensor arguments with ctx.save_for_backward?

background info

In the documentation for torch.autograd.Function.backward it is stated that:

It must accept a context ctx as the first argument, followed by as many outputs as forward() returned, and it should return as many tensors as there were inputs to forward(). Each argument is the gradient w.r.t. the given output, and each returned value should be the gradient w.r.t. the corresponding input.
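As I read that, the contract is that backward receives one gradient per forward() output and must return one value per forward() input. A minimal tensor-only illustration of my understanding (names hypothetical, just for reference):

import torch
from torch import autograd

class Scale(autograd.Function):
    # forward takes two tensor inputs and produces one output ...
    @staticmethod
    def forward(ctx, x, w):
        ctx.save_for_backward(x, w)
        return x * w

    # ... so backward receives one gradient and must return two values,
    # the gradients w.r.t. x and w respectively.
    @staticmethod
    def backward(ctx, d_out):
        x, w = ctx.saved_tensors
        return d_out * w, d_out * x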

I need two arguments, an integer and a boolean, to be passed from the module's attributes to the autograd function and on into the kernel. From the documentation above it sounds like this will not work with backward, since those aren't tensors. (I don't want to make them tensors either.)

Currently my forward looks like:

from torch import autograd, nn

# _cuda_ is the compiled C++/CUDA extension module
class modulefunction(autograd.Function):

    @staticmethod
    def forward(ctx, *args):

        output, *variables = _cuda_.module(args)
        # save_for_backward expects the tensors unpacked as separate arguments
        ctx.save_for_backward(*variables)

        return output

    @staticmethod
    def backward(ctx, d_output):

        # saved_variables is deprecated in favour of saved_tensors
        return _cuda_.dmodule(d_output, *ctx.saved_tensors)

class module(nn.Module):

    ...

    def forward(self, input):

        ...

        return modulefunction.apply(input, *other_tensors, integer, boolean)

Do I just have the C++ backward binding return placeholder values like 0 for those extra arguments?
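To make the question concrete, here is the kind of pattern I'm wondering about (purely a hypothetical sketch, untested): the non-tensor values stored directly on ctx as plain attributes in forward, and None returned in their positions from backward.

import torch
from torch import autograd

class SketchFunction(autograd.Function):
    @staticmethod
    def forward(ctx, input, integer, boolean):
        # save_for_backward is (as far as I know) only for tensors, so the
        # integer and boolean would be stashed on ctx as plain attributes.
        ctx.integer = integer
        ctx.boolean = boolean
        ctx.save_for_backward(input)
        return input * integer if boolean else input.clone()

    @staticmethod
    def backward(ctx, d_output):
        (input,) = ctx.saved_tensors
        d_input = d_output * ctx.integer if ctx.boolean else d_output
        # One return value per forward() input: a gradient for the tensor,
        # and None (rather than 0?) for the two non-tensor arguments.
        return d_input, None, None

This would then be called as SketchFunction.apply(input, integer, boolean), mirroring the apply call above. Is that the intended way, or does the C++ backward binding itself also need to return placeholders?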
