Currently, the checkpoint function allows only a Tensor/Variable as output. But some architectures have more than one output: for example, the encoder of a standard FCN may produce several outputs from the main stream and the skip connections, or an architecture may have both text and label outputs.

Right now, my workaround is to concatenate the tensors inside the function and split them again outside checkpoint, which is not very convenient.
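To make the workaround concrete, here is a minimal sketch of the concatenate-then-split approach (the `block` function and all shapes are made up for illustration; the split sizes must be known outside):

```python
import torch
from torch.utils.checkpoint import checkpoint

# Hypothetical block with two logical outputs, fused into one tensor
# so it can pass through checkpoint, which expects a single output here.
def block(x):
    main = torch.relu(x)        # shape (N, 8)
    skip = x[:, :4] + 1         # shape (N, 4)
    return torch.cat([main, skip], dim=1)  # fused shape (N, 12)

x = torch.randn(2, 8, requires_grad=True)
out = checkpoint(block, x)
# Unroll the outputs again outside checkpoint.
main, skip = out.split([8, 4], dim=1)
```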

So is there a neat way to do this? Or will lists be supported in the future?

I'm not sure I understand; I thought checkpoint takes an arbitrary number of input tensors and returns an arbitrary number of output tensors. Doesn't that work? Are you returning a list containing Tensors, or multiple tensors from the Python function?
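For reference, checkpoint does accept a function returning a tuple of Tensors directly, with no concatenation needed. A minimal sketch (the `encoder_block` function and its shapes are invented for illustration):

```python
import torch
from torch.utils.checkpoint import checkpoint

# Hypothetical encoder-like block with two outputs: a main path
# and a skip connection, returned as a tuple.
def encoder_block(x):
    main = torch.relu(x * 2)
    skip = x + 1
    return main, skip  # a tuple of Tensors, not a list

x = torch.randn(4, 8, requires_grad=True)
main, skip = checkpoint(encoder_block, x)  # tuple outputs unpack normally
(main.sum() + skip.sum()).backward()       # gradients flow through both
```

The key point is returning `main, skip` (a tuple), not `[main, skip]` (a list), which is what triggers the "expected Variable (got list)" style of error.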

What can one do if a custom layer (implemented by deriving from autograd.Function) has a forward function that must return a set of tensors, where the number of tensors depends on an input parameter? A list of tensors could help, but it looks like a list as output is not supported. I get the following error: "TypeError: ripsLayerBackward.forward: expected Variable (got list) for return value 0", where ripsLayer is the name of my layer.

You can return as many Tensors as you want (the number can even differ from one forward to the next). You just need to make sure that your backward handles this variable number of grad outputs.
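A minimal sketch of such a Function (the name `SplitChunks` and the chunking logic are invented for illustration): forward returns a tuple whose length depends on an input parameter, and backward accepts the matching number of grad outputs via `*grad_outputs`:

```python
import torch

class SplitChunks(torch.autograd.Function):
    """Hypothetical Function whose number of outputs depends on `n`:
    it splits the input into `n` equal chunks."""

    @staticmethod
    def forward(ctx, x, n):
        # Return a tuple of Tensors (clone to avoid returning views).
        return tuple(t.clone() for t in x.chunk(n))

    @staticmethod
    def backward(ctx, *grad_outputs):
        # One grad per forward output; reassemble the input gradient.
        # Return None for the non-Tensor argument `n`.
        return torch.cat(grad_outputs), None

x = torch.arange(6.0, requires_grad=True)
a, b, c = SplitChunks.apply(x, 3)
(a.sum() + 2 * b.sum() + 3 * c.sum()).backward()
print(x.grad)  # tensor([1., 1., 2., 2., 3., 3.])
```

Note that backward must return one gradient per *input* of forward (including `None` for non-Tensor inputs), while it receives one grad output per *output* of forward.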

The output of your function should be a bunch of Tensors. Tuples are a special construct in Python in the sense that when you write `return a, b`, you actually return a tuple containing a and b.
So I guess it should work, yes.