Repeated use of autograd.Function subclass

I’m a beginner. If I derive a subclass MyFunc of autograd.Function and then want to create a chain of independent objects of type MyFunc, how is this done? The tutorial covering this (https://pytorch.org/tutorials/beginner/pytorch_with_examples.html#pytorch-defining-new-autograd-functions) shows that the subclass forward / backward methods are static, and it invokes the base class method apply without creating any objects. In my setup I want to chain multiple MyFunc objects together. For each iteration the entire chain needs to run from beginning to end, then backpropagation runs in the reverse direction. I definitely need each object of type MyFunc to have an independent context.

Hi,

Function objects are designed to be elementary blocks that are not aware of anything else.
To combine Functions easily, the right thing to do is to write an nn.Module, or just a plain Python function, that calls the .apply of your Functions one by one in the right order.
The autograd engine will take care of calling them for the backward pass.
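To make this concrete, here is a minimal sketch (the squaring operation and the `chain` helper are illustrative, not from the original post). Each call to `MyFunc.apply` gets its own fresh `ctx` object from the autograd engine, so no per-object state is needed, and the contexts are automatically independent:

```python
import torch

class MyFunc(torch.autograd.Function):
    # A toy Function computing y = x^2, chosen only for illustration.
    @staticmethod
    def forward(ctx, x):
        # ctx is created fresh by autograd for every .apply call,
        # so each link in the chain has its own independent context.
        ctx.save_for_backward(x)
        return x * x

    @staticmethod
    def backward(ctx, grad_output):
        (x,) = ctx.saved_tensors
        return grad_output * 2 * x  # d(x^2)/dx = 2x

def chain(x, n=3):
    # Plain Python function that calls .apply repeatedly,
    # as suggested above; autograd records each call in the graph.
    for _ in range(n):
        x = MyFunc.apply(x)
    return x

x = torch.tensor(2.0, requires_grad=True)
y = chain(x)      # ((x^2)^2)^2 = x^8, so y = 256
y.backward()      # runs each backward in reverse order
print(x.grad)     # dy/dx = 8 * x^7 = 1024
```

The backward pass walks the chain in reverse automatically; each `backward` receives the `ctx` that its corresponding forward call populated.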