Torch.compile triggers anomaly detection

I’m setting trainer.detect_anomaly to False, but when switching from eager training to torch.compile with the default backend, I receive the following warning:

UserWarning: Anomaly Detection has been enabled. This mode will increase the runtime and should only be enabled for debugging.

Has anyone else encountered this?
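For reference, the warning corresponds to PyTorch's global autograd anomaly mode. A minimal sketch of how I'm checking that flag and force-disabling it before the training step, assuming trainer.detect_anomaly ultimately maps onto this global flag (that mapping is my assumption, not something I've verified in the trainer source):

```python
import torch

# Anomaly detection is a global autograd flag; in a fresh process it is off.
print(torch.is_anomaly_enabled())  # False by default

# Workaround attempt: explicitly force the flag off right before the
# compiled training step, in case something upstream flipped it on.
torch.autograd.set_detect_anomaly(False)
assert not torch.is_anomaly_enabled()
```

Even with this in place I still see the warning once the model is wrapped in torch.compile, which is why I suspect something inside the compile path is enabling anomaly mode itself.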

I haven’t been seeing this locally. Does this repro for you on a recent nightly?