Is it mandatory to add modules to ModuleList to access their parameters?

You don’t need an nn.ModuleList to register the parameters properly.
Could you link the topic where this is stated, please?

The parameters are registered as soon as the module is assigned as an attribute in your model:

import torch
import torch.nn as nn

class MyModel(nn.Module):
    def __init__(self):
        super(MyModel, self).__init__()
        self.fc = nn.Linear(1, 1)  # all parameters of the linear layer are registered here
        self.register_buffer('my_buffer', torch.tensor(1))  # the buffer is registered here
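
As a quick check (using the MyModel definition above), both the linear layer's parameters and the buffer show up without any ModuleList involved:

model = MyModel()

# the weight and bias of self.fc were registered through the attribute assignment
print(sorted(name for name, _ in model.named_parameters()))
# ['fc.bias', 'fc.weight']

# the buffer is returned by named_buffers() and is part of the state_dict
print(sorted(name for name, _ in model.named_buffers()))
# ['my_buffer']
print(sorted(model.state_dict().keys()))
# ['fc.bias', 'fc.weight', 'my_buffer']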

The __setattr__ method in nn.Module takes care of this registration.
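
Since __setattr__ is called for every attribute assignment, this also works for modules assigned after __init__; a small sketch (the attribute name extra is made up for illustration):

model = MyModel()
model.extra = nn.Linear(1, 1)  # intercepted by nn.Module.__setattr__ and registered

# the new submodule's parameters are now included as well
print(sorted(name for name, _ in model.named_parameters()))
# ['extra.bias', 'extra.weight', 'fc.bias', 'fc.weight']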

Note that plain tensors (i.e. not nn.Parameters, nn.Modules, or registered buffers) will not be registered and will thus not be returned by model.parameters() or model.buffers().
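
A short sketch of that difference (the class and attribute names are made up for the example):

class TensorVsParameter(nn.Module):
    def __init__(self):
        super().__init__()
        self.plain = torch.randn(1)                       # plain tensor: not registered
        self.weight = nn.Parameter(torch.randn(1))        # registered as a parameter
        self.register_buffer('running', torch.zeros(1))   # registered as a buffer

m = TensorVsParameter()
print(sorted(name for name, _ in m.named_parameters()))  # ['weight']
print(sorted(name for name, _ in m.named_buffers()))     # ['running']
print(sorted(m.state_dict().keys()))                     # ['running', 'weight'] - 'plain' is missing

The plain tensor is still a normal Python attribute, but it is not saved in the state_dict and is not moved by calls such as model.to(device).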