I have a model and an optimizer, to which I apply amp from the apex package:
from apex import amp
model = ...
optimizer = ...
model, optimizer = amp.initialize(model, optimizer, opt_level='O1')
After the model has been wrapped by amp, I would like to access one of its weights and change it, for example:
model.conv1.weight.data = new_tensor
The problem is that this has no effect. It looks like amp keeps a separate copy of the weights, so updating a weight on the fly does nothing.
Is there any way to update the weights on the fly after my model has been wrapped by amp?
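For context, here is the kind of in-place update I mean, shown on a plain (unwrapped) stand-in model with a conv1 layer — the stand-in model itself is just for illustration. Before amp.initialize, the assignment behaves as expected:

```python
import torch
import torch.nn as nn

# Stand-in for the real model; assumed only to have a conv1 layer.
model = nn.Sequential()
model.conv1 = nn.Conv2d(3, 8, kernel_size=3)

# The in-place update I am attempting after amp wrapping:
new_tensor = torch.zeros_like(model.conv1.weight)
model.conv1.weight.data = new_tensor

# Without amp, the module really sees the new weights.
print(model.conv1.weight.abs().sum().item())  # 0.0
```

After wrapping with amp (opt_level='O1'), the same assignment appears to be ignored, which is what prompts the question.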