Triton random warning

Every time I execute a function decorated with @dynamo.optimize("inductor"), I get the following warning:
torch._inductor.utils: [WARNING] using triton random, expect difference from eager

I am fine with differences due to a faster RNG, but I would love for this warning to be suppressible. The API for warnings seems to have changed a lot since I last used PyTorch, so I was hoping for some help.


torch._inductor.config.fallback_random = True

That setting, which I assume disables the more efficient RNG, no longer works.
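One possible workaround, assuming the warning is emitted through Python's standard logging machinery (the `torch._inductor.utils:` prefix in the message suggests a logger of that name), is to raise that logger's threshold above WARNING:

```python
import logging

# The message is prefixed "torch._inductor.utils: [WARNING]", which
# suggests it comes from the standard-library logger of the same name.
# Raising that logger's level above WARNING should silence it while
# leaving errors visible. The logger name is an assumption inferred
# from the message prefix, not a documented API.
logging.getLogger("torch._inductor.utils").setLevel(logging.ERROR)
```

This only hides the message; it does not change which RNG Inductor actually uses.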

Ah, this seems annoying. Could you please open an issue so we can track it? Issues · pytorch/pytorch · GitHub