No x86_64-linux-gnu-g++ version bounds defined for CUDA version

Hi,
I’m trying to build FlashAttention from source, and I get this warning from PyTorch:

venv/lib/python3.11/site-packages/torch/utils/cpp_extension.py:490: UserWarning: There are no x86_64-linux-gnu-g++ version bounds defined for CUDA version 12.8

How can I solve this?
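For context, the warning seems to come from a per-CUDA-version table of supported host-compiler version ranges in torch/utils/cpp_extension.py: when the detected CUDA version (12.8 here) has no entry, PyTorch warns instead of enforcing a bound. A rough sketch of what I understand the check to do (the table contents and function name below are illustrative, not PyTorch's actual data):

```python
import warnings

# Hypothetical bounds table mirroring the idea in torch/utils/cpp_extension.py.
# Maps a CUDA release to a (min, max) supported g++ version range.
# Entries are made up for illustration; they are NOT torch's real values.
GCC_BOUNDS = {
    "12.4": ("6.0", "13.2"),
    "12.6": ("6.0", "13.2"),
}

def check_host_compiler(cuda_version: str, gcc_version: str) -> bool:
    """Return True if the compiler is accepted (or unchecked)."""
    bounds = GCC_BOUNDS.get(cuda_version)
    if bounds is None:
        # No bounds recorded for this CUDA release: warn and continue,
        # which is the situation described above for CUDA 12.8.
        warnings.warn(
            "There are no x86_64-linux-gnu-g++ version bounds defined "
            f"for CUDA version {cuda_version}"
        )
        return True
    lo, hi = bounds
    # Naive dotted-version comparison via integer tuples.
    key = lambda v: tuple(int(p) for p in v.split("."))
    return key(lo) <= key(gcc_version) <= key(hi)
```

If that reading is right, the message is informational: the build proceeds, PyTorch just can't verify the compiler pairing for an unrecognized CUDA release.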