torch.compile and Parameter replacement in a module

Hi everyone,

I have a case where I need to manually replace some parameters of a custom nn.Module at regular points during training.
It looks essentially like this:

import torch

class Custom(torch.nn.Module):
    def __init__(self, ...):
        super().__init__()
        self.coef = torch.nn.Parameter(... initial values ...)

def updateCoef(module):
    # rebind the attribute to a brand-new Parameter object
    module.coef = torch.nn.Parameter(someCalculation(module.coef))

The function updateCoef is called when the parameter replacement is needed.

My question is: what happens if I use model.compile() before training? Will the replacement of the parameter matter with respect to what was compiled?
I’ve done some limited testing, and there seems to be some lag after each update, as if a recompilation were automatically triggered… but I’d like to be sure I’m not doing something plainly wrong.

Thanks for any hints!