Hello there,
after optimizing an embedding for a while, I replace it: I change the dimensions and values of the embedding tensor by instantiating a new nn.Embedding.
Do I have to reset the params of my optimizer to keep optimizing this updated embedding? If so, how can I do it without creating a whole new optimizer instance, which would reset the adaptive learning-rate state and the weight decay?
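A minimal sketch of what I mean (the sizes, learning rate, and weight decay below are just placeholder values):

```python
import torch
import torch.nn as nn

# old embedding being optimized
emb = nn.Embedding(100, 16)
opt = torch.optim.Adam(emb.parameters(), lr=1e-3, weight_decay=1e-4)

# ... some training steps, so the optimizer builds up state ...
loss = emb(torch.tensor([0, 1, 2])).sum()
loss.backward()
opt.step()
opt.zero_grad()

# now swap in a new embedding with different dimensions and values;
# opt still holds references to the *old* weight tensor and its state
emb = nn.Embedding(200, 32)
```

After the swap, the optimizer's param groups still point at the old tensor, which is what prompts my question.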
Thank you in advance for looking into this.
Best regards
Chrixtar