The key point is that only the parameters passed to the optimizer get updated. The usual way to pass parameters to the optimizer is via mymodel.parameters(), like this, for example…
optimizer = optim.SGD(mymodel.parameters(), lr=0.01)
You can also add parameters manually using add_param_group, like this…
optimizer.add_param_group({'params': A}) # taking A from your example
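To make this concrete, here is a minimal runnable sketch. The tensor A here is hypothetical, standing in for the A from your example, and the model is just a placeholder nn.Linear:

```python
import torch
import torch.nn as nn
import torch.optim as optim

# Hypothetical standalone parameter, standing in for the "A" in the question.
A = nn.Parameter(torch.randn(3, 3))

model = nn.Linear(3, 3)
optimizer = optim.SGD(model.parameters(), lr=0.01)

# A is not part of model.parameters(), so register it explicitly.
optimizer.add_param_group({'params': A})

# Both the model's weights and A now receive gradient updates.
loss = (model(torch.randn(2, 3)) @ A).sum()
loss.backward()
optimizer.step()
```

After add_param_group, the optimizer has a second parameter group containing A, so optimizer.step() updates it alongside the model's own parameters.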
By wrapping the LinearCombo in a class (an nn.Module subclass), I have ensured that, if I use it in my model in the usual way, its parameters will be included in mymodel.parameters().
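Here is a sketch of what that wrapping looks like. The actual LinearCombo from your example may differ; this assumed version just takes a learnable weighted sum along the last dimension:

```python
import torch
import torch.nn as nn

class LinearCombo(nn.Module):
    """Assumed implementation: a learnable linear combination of inputs."""
    def __init__(self, n):
        super().__init__()
        # nn.Parameter registers the tensor with the module,
        # so it is reported by .parameters()
        self.weights = nn.Parameter(torch.randn(n))

    def forward(self, xs):
        # weighted sum along the last dimension
        return (xs * self.weights).sum(dim=-1)

class MyModel(nn.Module):
    def __init__(self):
        super().__init__()
        # assigning the submodule as an attribute is enough:
        # its parameters propagate up to mymodel.parameters()
        self.combo = LinearCombo(4)

    def forward(self, x):
        return self.combo(x)

mymodel = MyModel()
print(sum(p.numel() for p in mymodel.parameters()))  # 4
```

Because combo is assigned as an attribute of an nn.Module, its weights appear in mymodel.parameters() with no extra bookkeeping, and optim.SGD(mymodel.parameters(), lr=0.01) picks them up automatically.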
That said, if you prefer to use torch.nn.functional to build your model, then you will have to organise the code a little differently; for an example, see Are torch.nn.Functional layers learnable?
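The usual pattern in that case is to create and register the parameters yourself with nn.Parameter and pass them to the functional call. A minimal sketch (the class name and sizes here are illustrative):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class FunctionalLinear(nn.Module):
    """Linear layer built from torch.nn.functional with
    explicitly registered parameters."""
    def __init__(self, in_features, out_features):
        super().__init__()
        # Explicit nn.Parameter registration keeps these learnable
        # and visible to .parameters()
        self.weight = nn.Parameter(torch.randn(out_features, in_features))
        self.bias = nn.Parameter(torch.zeros(out_features))

    def forward(self, x):
        # the functional call itself holds no state;
        # the parameters live on the module
        return F.linear(x, self.weight, self.bias)

m = FunctionalLinear(3, 2)
print(len(list(m.parameters())))  # 2
```

The functional API holds no state of its own, so anything you want trained has to be an nn.Parameter attribute (or be added to the optimizer manually, as above).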