When you create a model via nn.Module, all submodules with trainable parameters will initialize nn.Parameter objects internally, which require gradients by design.
You can check these via print(dict(model.named_parameters())). This approach is used in the first example.
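As a minimal sketch (using nn.Linear purely as an example module), you can see that the registered parameters require gradients by default:

```python
# A module's trainable parameters are registered as nn.Parameters,
# which are created with requires_grad=True by default.
import torch
import torch.nn as nn

model = nn.Linear(4, 2)  # registers 'weight' and 'bias' as nn.Parameters
params = dict(model.named_parameters())
print(params.keys())
print(all(p.requires_grad for p in params.values()))  # True
```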
However, you don't necessarily need to use modules and can also write your code in a purely functional way. In this case you could create trainable parameters by creating tensors with requires_grad=True yourself.