Resetting optimizer params without resetting the learning rate, etc.

Hello there,

After I optimize an embedding for a while, I change it (i.e., I change the dimensions and values of the embedding tensor by instantiating a new nn.Embedding).
Do I have to reset the params of my optimizer to keep optimizing this updated embedding, and if so, how can I do it without creating a whole new optimizer instance, which would reset the adaptive learning rate and the weight decay?

Thank you in advance for looking into this.

Best regards


You could replace old_embedding.parameters() in optimizer.param_groups[0]['params'] with list(new_embedding.parameters()) to fit your requirement. Note that .parameters() returns a generator, so materialize it with list() before assigning. The group-level hyperparameters (lr, weight decay, etc.) are kept; only the per-parameter optimizer state (e.g., Adam's running averages) starts fresh, since it is keyed by the old tensors.
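A minimal sketch of this, assuming Adam as the optimizer (variable names like old_embedding/new_embedding are illustrative):

```python
import torch
import torch.nn as nn

# Train an embedding for a while with an existing optimizer.
old_embedding = nn.Embedding(10, 4)
optimizer = torch.optim.Adam(old_embedding.parameters(), lr=1e-3, weight_decay=1e-4)

# Later, replace the embedding with a differently sized one.
new_embedding = nn.Embedding(20, 8)

# Point the existing param group at the new parameters instead of
# creating a new optimizer. .parameters() is a generator, so wrap it
# in list() before assigning.
optimizer.param_groups[0]['params'] = list(new_embedding.parameters())

# The group-level hyperparameters survive the swap.
print(optimizer.param_groups[0]['lr'])            # still 1e-3
print(optimizer.param_groups[0]['weight_decay'])  # still 1e-4

# Optimization continues on the new embedding; Adam initializes fresh
# per-parameter state (exp_avg, exp_avg_sq) on the first step, because
# optimizer.state is keyed by the old parameter tensors.
loss = new_embedding(torch.tensor([0, 1])).sum()
loss.backward()
optimizer.step()
```

If you attach a learning-rate scheduler to this optimizer, it keeps working across the swap as well, since it only reads and writes the param groups.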


Yes, thank you, this is exactly what I was looking for.