Memory issues even after setting PYTORCH_CUDA_ALLOC_CONF

I am currently getting out-of-memory errors, and far more memory is reserved compared to what has actually been allocated:

(GPU 0; 23.70 GiB total capacity; 15.52 GiB already allocated; 1.29 GiB free; 20.33 GiB reserved in total by PyTorch)

I am setting PYTORCH_CUDA_ALLOC_CONF with:


os.environ["PYTORCH_CUDA_ALLOC_CONF"] = 'max_split_size_mb:10000,garbage_collection_threshold:0.7'

But it does not seem to change anything.
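For reference, here is a minimal sketch of how I understood the variable should be set. The ordering comment is an assumption on my part, based on the PyTorch docs stating that the caching allocator reads PYTORCH_CUDA_ALLOC_CONF when it initializes, so setting it after CUDA has already been used would have no effect:

```python
import os

# The allocator reads this env var at initialization time, i.e. at the
# first CUDA allocation. Setting it before `import torch` is the safest
# ordering; setting it after CUDA work has started is silently ignored.
os.environ["PYTORCH_CUDA_ALLOC_CONF"] = (
    "max_split_size_mb:10000,garbage_collection_threshold:0.7"
)

# Sanity-check the option string: comma-separated key:value pairs.
opts = dict(
    kv.split(":") for kv in os.environ["PYTORCH_CUDA_ALLOC_CONF"].split(",")
)
print(opts)  # {'max_split_size_mb': '10000', 'garbage_collection_threshold': '0.7'}

# import torch  # torch must be imported only after the env var is set
```

Note also that max_split_size_mb:10000 is larger than almost any single allocation, which (if I read the docs correctly) effectively disables the split-size limit rather than tightening it; smaller values like 512 are what is usually suggested to combat fragmentation.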