Change weights of a model wrapped in amp

Hi.

I have a model and an optimizer, to which I apply amp from the apex package:

from apex import amp
model = ...
optimizer = ...
model, optimizer = amp.initialize(model, optimizer, opt_level='O1')

After the model has been wrapped in amp, I would like to access one of its weights and change it. For example:

model.conv1.weight.data = new_tensor 

The problem is that this has no effect. It looks like amp keeps a separate copy of the weights, so updating the weight on the fly changes nothing.
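For reference, without amp the usual way to overwrite a parameter in place is copy_ under no_grad (a minimal sketch, assuming the layer is really called conv1 as in the line above):

import torch

# Copy the new values into the existing parameter instead of reassigning .data
with torch.no_grad():
    model.conv1.weight.copy_(new_tensor)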

Is there any way to update the weights on the fly after my model has been wrapped by amp?

Thanks

Anyone? I have tried reinitializing the amp wrapper, but this is not advised by the amp documentation.

I would not recommend using apex/amp anymore; switch to the native implementation as described here instead.
One reason is the added flexibility for use cases like this.
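As a minimal sketch of what the native API looks like (loader, criterion, new_tensor, and the conv1 name are placeholders taken from your snippet or made up here): with torch.cuda.amp the model and optimizer are not wrapped at all, so the parameters stay ordinary fp32 tensors and in-place updates take effect directly.

import torch
from torch.cuda.amp import autocast, GradScaler

model = ...          # your model, left unchanged
optimizer = ...      # your optimizer, left unchanged
scaler = GradScaler()

for data, target in loader:
    optimizer.zero_grad()
    # Forward pass runs selected ops in reduced precision
    with autocast():
        output = model(data)
        loss = criterion(output, target)
    # Scale the loss, backprop, then unscale and step through the scaler
    scaler.scale(loss).backward()
    scaler.step(optimizer)
    scaler.update()

# The parameters are plain fp32 tensors, so this takes effect as expected
with torch.no_grad():
    model.conv1.weight.copy_(new_tensor)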

Thanks a lot!! I will do this