Autograd Function vs nn.Module?

Hi,

This post, "Difference of methods between torch.nn and functional", should answer most of your questions.

2: I would say nn.Module, since you have parameters.
3: You need to write the backward function yourself if you implement a Function, because a Function works directly with Tensors. An nn.Module, on the other hand, works with Variables, so its backward pass is derived automatically by autograd. A rough sketch of the difference is below.
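As a quick illustration (my own sketch, not from the linked post; `MyExp` and `MyLinear` are made-up names, shown with the current Tensor-based API rather than the old Variable wrapper):

```python
import torch
from torch import nn
from torch.autograd import Function

class MyExp(Function):
    # With a Function you implement both forward and backward by hand.
    @staticmethod
    def forward(ctx, x):
        result = x.exp()
        ctx.save_for_backward(result)
        return result

    @staticmethod
    def backward(ctx, grad_output):
        result, = ctx.saved_tensors
        return grad_output * result  # d/dx exp(x) = exp(x)

class MyLinear(nn.Module):
    # With an nn.Module you only write forward; autograd derives the backward.
    def __init__(self, in_features, out_features):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(out_features, in_features))
        self.bias = nn.Parameter(torch.zeros(out_features))

    def forward(self, x):
        return x @ self.weight.t() + self.bias

x = torch.randn(4, 3, requires_grad=True)
y = MyExp.apply(x).sum() + MyLinear(3, 2)(x).sum()
y.backward()  # gradients flow through both the custom Function and the Module
```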
