Could the GPU memory reserved by PyTorch be set or increased?

I have encountered the memory problem below:
torch.cuda.OutOfMemoryError: CUDA out of memory. Tried to allocate 1024.00 MiB (GPU 0; 6.00 GiB total capacity; 3.60 GiB already allocated; 69.44 MiB free; 3.66 GiB reserved in total by PyTorch)

How could I increase the reserved memory? Any help is appreciated.

You won’t be able to allocate 1 GiB of memory if only ~69 MiB are free — the "reserved" number isn't a limit you can raise; it's just the memory the PyTorch caching allocator has already claimed from the GPU.
Reduce the batch size, or lower the memory usage in some other way (e.g. a smaller model, mixed precision, or gradient accumulation).
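If your training needs the larger batch size for stable gradients, gradient accumulation is a common way to keep the *effective* batch size while only holding a small batch in memory at once: run several small forward/backward passes and step the optimizer once. A minimal sketch — the model, sizes, and optimizer here are placeholders, not your actual setup:

```python
import torch
import torch.nn as nn

# Placeholder model; substitute your own.
model = nn.Linear(128, 10)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()

effective_batch = 64   # the batch size you want but cannot fit
micro_batch = 16       # a batch size that does fit in GPU memory
accum_steps = effective_batch // micro_batch

optimizer.zero_grad()
for step in range(accum_steps):
    # Placeholder random data; substitute your real mini-batches.
    x = torch.randn(micro_batch, 128)
    y = torch.randint(0, 10, (micro_batch,))
    # Scale the loss so accumulated gradients match a full-batch step.
    loss = loss_fn(model(x), y) / accum_steps
    loss.backward()  # gradients accumulate across micro-batches
optimizer.step()     # one optimizer step per effective batch
optimizer.zero_grad()
```

This trades a little extra compute time for a roughly `accum_steps`-fold reduction in activation memory, since only one micro-batch is resident at a time.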