torch.compile() verbose logging

Hi, I have been training some fairseq models with PyTorch 2.0 and added torch.compile() to the training code. During training, the console fills with verbose torch._inductor and torch._dynamo logging statements like the following. Is there a way to quiet them or turn them off?

[2023-03-23 19:51:25,748] torch._inductor.utils: [INFO] using triton random, expect difference from eager
[2023-03-23 19:51:25,839] torch._inductor.compile_fx: [INFO] Step 3: torchinductor done compiling FORWARDS graph 40
[2023-03-23 19:51:25,840] torch._dynamo.output_graph: [INFO] Step 2: done compiler function debug_wrapper
[2023-03-23 19:51:25,849] torch._dynamo.symbolic_convert: [INFO] Step 1: torchdynamo start tracing <graph break in forward>
[2023-03-23 19:51:25,886] torch._dynamo.symbolic_convert: [INFO] Step 1: torchdynamo start tracing forward
[2023-03-23 19:51:25,922] torch._dynamo.symbolic_convert: [INFO] Step 1: torchdynamo start tracing extract_features
[2023-03-23 19:51:25,958] torch._dynamo.symbolic_convert: [INFO] Step 1: torchdynamo start tracing extract_features_scriptable
[2023-03-23 19:51:25,984] torch._dynamo.output_graph: [INFO] Step 2: calling compiler function debug_wrapper
[2023-03-23 19:51:26,076] torch._inductor.compile_fx: [INFO] Step 3: torchinductor compiling FORWARDS graph 41
[2023-03-23 19:51:26,079] torch._inductor.graph: [INFO] Using FallbackKernel: aten.cumsum
[2023-03-23 19:51:26,090] torch._inductor.utils: [INFO] using triton random, expect difference from eager
[2023-03-23 19:51:26,302] torch._inductor.compile_fx: [INFO] Step 3: torchinductor done compiling FORWARDS graph 41
[2023-03-23 19:51:26,302] torch._dynamo.output_graph: [INFO] Step 2: done compiler function debug_wrapper
[2023-03-23 19:51:26,309] torch._dynamo.symbolic_convert: [INFO] Step 1: torchdynamo start tracing <graph break in extract_features_scriptable>
[2023-03-23 19:51:27,139] torch._dynamo.symbolic_convert: [INFO] Step 1: torchdynamo done tracing <graph break in extract_features_scriptable> (RETURN_VALUE)
[2023-03-23 19:51:27,152] torch._dynamo.output_graph: [INFO] Step 2: calling compiler function debug_wrapper
[2023-03-23 19:51:31,261] torch._inductor.compile_fx: [INFO] Step 3: torchinductor compiling FORWARDS graph 42
[2023-03-23 19:51:31,388] torch._inductor.utils: [INFO] using triton random, expect difference from eager
[2023-03-23 19:51:34,089] torch._inductor.compile_fx: [INFO] Step 3: torchinductor done compiling FORWARDS graph 42
[2023-03-23 19:51:34,090] torch._dynamo.output_graph: [INFO] Step 2: done compiler function debug_wrapper
[2023-03-23 19:51:55,753] torch._dynamo.symbolic_convert: [INFO] Step 1: torchdynamo start tracing <graph break in forward>
[2023-03-23 19:51:55,756] torch._dynamo.symbolic_convert: [INFO] Step 1: torchdynamo done tracing <graph break in forward> (RETURN_VALUE)
[2023-03-23 19:51:55,758] torch._dynamo.output_graph: [INFO] Step 2: calling compiler function debug_wrapper
[2023-03-23 19:51:55,777] torch._inductor.compile_fx: [INFO] Step 3: torchinductor compiling FORWARDS graph 43
[2023-03-23 19:51:55,856] torch._inductor.compile_fx: [INFO] Step 3: torchinductor done compiling FORWARDS graph 43
[2023-03-23 19:51:55,856] torch._dynamo.output_graph: [INFO] Step 2: done compiler function debug_wrapper
[2023-03-23 19:51:55,863] torch._inductor.compile_fx: [INFO] Step 3: torchinductor compiling BACKWARDS graph 43
[2023-03-23 19:51:55,867] torch._inductor.compile_fx: [INFO] Step 3: torchinductor done compiling BACKWARDS graph 43
[2023-03-23 19:51:55,904] torch._inductor.compile_fx: [INFO] Step 3: torchinductor compiling BACKWARDS graph 42
[2023-03-23 19:51:59,590] torch._inductor.compile_fx: [INFO] Step 3: torchinductor done compiling BACKWARDS graph 42

This seems to be related to this topic, which already tracks the request and links to a PR that should fix the issue.
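
In the meantime, a possible workaround is a minimal sketch using Python's standard logging module, assuming these messages are emitted through the logger names shown in the output above (torch._dynamo.* and torch._inductor.*): raise the level on the two parent loggers before calling torch.compile().

import logging

# Raise the level on the parent loggers seen in the output above.
# Child loggers such as torch._dynamo.symbolic_convert and
# torch._inductor.compile_fx inherit this level unless they set their own.
logging.getLogger("torch._dynamo").setLevel(logging.WARNING)
logging.getLogger("torch._inductor").setLevel(logging.WARNING)

With this in place, the INFO-level "Step 1/2/3" messages should be suppressed while warnings and errors still come through. Note this is only a sketch; if torch._dynamo reconfigures its own log levels internally, it may override these settings.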