Initialize parameters in a torch.autograd.Function

Hi,

No, this is not possible.
We don’t really use Function as a stateful class; it is more a convenient way to group the pair of functions for forward and backward.
You can pass these arguments directly to the forward though (and return None for their grad in the backward).
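To illustrate, here is a minimal sketch (the `ScaledSquare` name and the `scale` argument are made up for this example): an extra non-tensor argument is passed to `forward`, and its slot in `backward` returns `None` since it needs no gradient.

```python
import torch

class ScaledSquare(torch.autograd.Function):
    @staticmethod
    def forward(ctx, x, scale):
        # Extra argument passed directly to forward, stored on ctx.
        ctx.save_for_backward(x)
        ctx.scale = scale
        return scale * x ** 2

    @staticmethod
    def backward(ctx, grad_output):
        (x,) = ctx.saved_tensors
        # One return value per forward input: a grad for x, None for scale.
        return grad_output * 2 * ctx.scale * x, None

x = torch.tensor([1.0, 2.0], requires_grad=True)
y = ScaledSquare.apply(x, 3.0)
y.sum().backward()
print(x.grad)  # tensor([ 6., 12.])
```

If you need actual learnable parameters, the usual pattern is to create them in an `nn.Module` and pass them into `Function.apply` from the module's `forward`.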
