In my code, the optimizer updates the parameters after each training step. At the start of the next forward pass, those parameters are used to generate the weights my model actually uses. This generation happens on every training step and on the first step of eval.
I understand that I could use a forward_pre_hook to accomplish this. However, handling that first eval step does not feel very clean: I have to keep a flag in my model to track whether the regeneration has already happened.
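To make the workaround concrete, here is a minimal sketch of the forward_pre_hook-plus-flag approach. WeightGen, its "generation rule", and the _needs_regen flag are hypothetical stand-ins for my actual model, and for brevity the generation here is done under no_grad (in the real model the generation would stay differentiable so the optimizer can update the underlying parameters):

```python
import torch
import torch.nn as nn

class WeightGen(nn.Module):
    """Hypothetical module whose effective weights are generated from a parameter."""

    def __init__(self):
        super().__init__()
        # learned parameter that the optimizer updates
        self.seed = nn.Parameter(torch.randn(4))
        # weights actually used in forward(), derived from `seed`
        self.register_buffer("generated", torch.zeros(4))
        self._needs_regen = True  # the flag mentioned above

    def forward(self, x):
        return x * self.generated

def regen_pre_hook(module, inputs):
    # regenerate only when the flag says the parameters changed
    if module._needs_regen:
        with torch.no_grad():
            module.generated.copy_(module.seed * 2.0)  # toy generation rule
        module._needs_regen = False

model = WeightGen()
model.register_forward_pre_hook(regen_pre_hook)
y = model(torch.ones(4))  # hook fires before forward(), filling `generated`
```

In training, the flag would be set back to True after every optimizer step, which is exactly the bookkeeping I would like to avoid.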
I believe it would be more convenient if my nn.Module could call a hook after each optimizer step, both for me and for anyone else who wants to optimize parameters that are then used to generate weights, filters, and so on.
Does such an optimizer_step_post_hook exist? If not, is it possible to implement one?
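In case no built-in hook exists, one workaround I can imagine is wrapping the optimizer's step() so that registered callbacks fire after every update. A framework-agnostic sketch (StepPostHookMixin, ToyOptimizer, and HookedToyOptimizer are all hypothetical names):

```python
from typing import Callable, List

class StepPostHookMixin:
    """Mixin that fires registered callbacks after every step() call."""

    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self._step_post_hooks: List[Callable] = []

    def register_step_post_hook(self, hook: Callable) -> None:
        self._step_post_hooks.append(hook)

    def step(self, *args, **kwargs):
        result = super().step(*args, **kwargs)
        for hook in self._step_post_hooks:
            hook(self)  # pass the optimizer to the callback
        return result

class ToyOptimizer:
    """Minimal stand-in optimizer, so the pattern can run without any framework."""

    def __init__(self, params, lr=0.1):
        self.params = params
        self.lr = lr

    def step(self):
        # Pretend each parameter's gradient is 1.0 and take a gradient step.
        for i, p in enumerate(self.params):
            self.params[i] = p - self.lr * 1.0

class HookedToyOptimizer(StepPostHookMixin, ToyOptimizer):
    pass

opt = HookedToyOptimizer([1.0, 2.0], lr=0.1)
calls = []
opt.register_step_post_hook(lambda o: calls.append(list(o.params)))
opt.step()  # the hook sees the freshly updated parameters
```

With a real framework the same mixin could be combined with an actual optimizer class (e.g. a hypothetical `class HookedSGD(StepPostHookMixin, torch.optim.SGD)`), and the hook would set the model's regeneration flag. It may also be worth checking whether your PyTorch version already exposes something like Optimizer.register_step_post_hook, which would make this wrapper unnecessary.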