Disable TracerWarnings for known constants

Hi everyone,
is there a way to deactivate certain TracerWarnings when I know the input size will not change?

My code is written so that it works with variable input sizes, but within a single run the input size is always constant. I do not want to switch to TorchScript, because distributions are not properly supported there, and because I do not need that flexibility anyway, since my inputs are effectively constant.

For example, the following gives me a warning:

all(s == x.shape[0] for s in x.shape)

TracerWarning: Converting a tensor to a Python boolean might cause the trace to be incorrect. We can't record the data flow of Python values, so this value will be treated as a constant in the future. This means that the trace might not generalize to other inputs!

I understand why the warning makes sense; however, I am wondering whether I can deactivate it, since I know that within a single run the condition will be constant.
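
For context, a minimal reproduction looks roughly like this (the module is just a toy, and depending on the PyTorch version the warning may point at the shape comparison or at the `if`):

```python
import torch

class Toy(torch.nn.Module):
    def forward(self, x):
        # Shape-dependent control flow: the tracer bakes this bool in as a constant
        if all(s == x.shape[0] for s in x.shape):
            return x * 2
        return x

# Tracing with a square input emits the TracerWarning quoted above
traced = torch.jit.trace(Toy(), torch.randn(3, 3))
```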

You could try setting check_trace=False in torch.jit.trace, or, if that doesn’t help, you could suppress them using warnings.filterwarnings.
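
For the warnings route, something along these lines should work (note this silences all TracerWarnings process-wide, which is coarse but simple):

```python
import warnings
import torch

# Suppress every TracerWarning for the rest of the process
warnings.filterwarnings("ignore", category=torch.jit.TracerWarning)
```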


Thanks. check_trace doesn’t work for me, but I will look into warnings.filterwarnings!

The warning happens when calling torch.onnx.export, or when JIT tracing in general. However, you would not want to filter all such warnings for the torch.onnx.export call, only the ones from these specific, known-constant checks. Plain warnings.filterwarnings would again suppress all of them globally, which is not what you want. So maybe combine it with warnings.catch_warnings to limit the filter’s scope? Is that really the best way, though? It looks quite ugly.
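
Something like this is what I have in mind (export_quiet is just my name for it; the message regex targets only the boolean-conversion warning):

```python
import warnings
import torch

def export_quiet(model, args, path):
    # The global warning filters are restored once this block exits
    with warnings.catch_warnings():
        # Only silence the specific TracerWarning about bool conversion
        warnings.filterwarnings(
            "ignore",
            category=torch.jit.TracerWarning,
            message="Converting a tensor to a Python boolean",
        )
        torch.onnx.export(model, args, path)
```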