Why use Parameter if tensors have grad on by default?

Is it just to add the tensor to the module's list of parameters, so that it gets updated when you pass module.parameters() to the optimizer?

Hi,

By default grad mode is enabled, but the tensors you create have requires_grad=False, so they do not track gradients.
And yes, wrapping a tensor in Parameter is also what registers it automatically in the module's list of parameters.
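
Here is a minimal sketch of both points (the module and attribute names are just made up for the illustration):

```python
import torch
import torch.nn as nn

# Grad mode is on by default, but a freshly created tensor
# does not require gradients.
t = torch.randn(3)
print(torch.is_grad_enabled())  # True
print(t.requires_grad)          # False

class MyModule(nn.Module):
    def __init__(self):
        super().__init__()
        # nn.Parameter has requires_grad=True and is automatically
        # registered as a parameter of the module.
        self.weight = nn.Parameter(torch.randn(3))
        # A plain tensor attribute is NOT registered, so the
        # optimizer will never see it.
        self.plain = torch.randn(3)

m = MyModule()
print([name for name, _ in m.named_parameters()])    # ['weight']
optimizer = torch.optim.SGD(m.parameters(), lr=0.1)  # only 'weight' is updated
```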