Calling torch.compile inside the training loop

Hello!

I had a question regarding how one should use torch.compile inside the training loop.
Currently, I am calling torch.compile(model) for every iteration of the training loop.

For example, it currently looks something like this (P.S. I am actually training a diffusion model):

for img, label in data:
    model = torch.compile(model)  # <- torch.compile gets called on every iteration
    output = model(img)
    loss = mse_loss(output, label)
    ...
    (rest of the code)

Is this the correct way of using it, though? Would it be okay to call torch.compile multiple times like this?

The docs do not contain any information about this, so I am posting the question here.

Thank you so much!
Best regards

No, I think calling it once outside the training loop is the right approach.
CC @marksaroufim in case there is a use case for multiple calls
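In code, that pattern would look roughly like this (a minimal sketch with a stand-in model and dummy data rather than your actual setup; assumes PyTorch 2.x):

import torch
import torch.nn as nn
import torch.nn.functional as F

model = nn.Sequential(nn.Linear(64, 64), nn.ReLU(), nn.Linear(64, 64))  # stand-in for the real model
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
data = [(torch.randn(8, 64), torch.randn(8, 64)) for _ in range(10)]    # dummy batches

compiled_model = torch.compile(model)  # compile once, before the loop

for img, label in data:
    output = compiled_model(img)       # the first call triggers the actual compilation
    loss = F.mse_loss(output, label)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()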

You can actually compile the entire training loop; otherwise, I typically recommend compiling the model once, outside of the training loop.
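A compiled-training-step variant of the same sketch might look roughly like this (again with a stand-in model and dummy data; depending on the PyTorch version, pieces such as optimizer.step() may graph-break and run eagerly):

import torch
import torch.nn as nn
import torch.nn.functional as F

model = nn.Sequential(nn.Linear(64, 64), nn.ReLU(), nn.Linear(64, 64))  # stand-in for the real model
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
data = [(torch.randn(8, 64), torch.randn(8, 64)) for _ in range(10)]    # dummy batches

@torch.compile
def train_step(img, label):
    # forward, loss, backward, and optimizer update all inside the compiled region
    output = model(img)
    loss = F.mse_loss(output, label)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss

for img, label in data:
    loss = train_step(img, label)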