How to interpret this CUDA OOM error message

I am confused by this error message. It says the machine has 23.70 GiB of GPU memory, which should be more than enough for a 30.00 MiB allocation. Why is the error below thrown? Could you help me clarify the reason? Thank you!

RuntimeError: CUDA out of memory. Tried to allocate 30.00 MiB (GPU 0; 23.70 GiB total capacity; 1.60 GiB already allocated; 13.03 GiB free; 1.66 GiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF
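To make the numbers in that message concrete, here is a small sketch that just does the arithmetic on the values the error reports, and shows how `PYTORCH_CUDA_ALLOC_CONF` would be set (the value `max_split_size_mb:128` is purely illustrative, not a recommendation from this thread; the variable must be set before the first CUDA allocation):

```python
import os

# Values copied from the error message (GiB unless noted otherwise):
total_gib = 23.70      # GPU 0 total capacity
allocated_gib = 1.60   # memory actively backing live tensors
free_gib = 13.03       # free memory as seen at the time of the failure
reserved_gib = 1.66    # reserved in total by PyTorch (allocated + cached)
request_mib = 30.00    # size of the allocation that failed

# Memory PyTorch has cached but that is not backing any live tensor.
# The hint about max_split_size_mb applies when this gap is large
# (reserved >> allocated); here it is only ~0.06 GiB.
cached_gib = reserved_gib - allocated_gib

# Memory accounted for neither as free nor as reserved by this process
# (other processes, the CUDA context, etc.):
used_elsewhere_gib = total_gib - free_gib - reserved_gib

# Illustrative only -- must happen before torch initializes CUDA:
os.environ["PYTORCH_CUDA_ALLOC_CONF"] = "max_split_size_mb:128"
```

Under this reading, `reserved` is not much larger than `allocated`, which is why the fragmentation hint in the message looks inapplicable here.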

This error indeed seems to be at least misleading, since your device has >13 GiB of free memory and I doubt it's all fragmented. In any case, could you post a minimal and executable code snippet so that I can try to reproduce and debug the issue, please?

Sorry, but I cannot provide the desired code, because the original code is very large and complicated. I suspect that the PyTorch multiprocessing library used by the code may be related to this error.