Should I add empty_cache() after each batch?

Hello everyone.

I am training my model but ran into an OOM error. I already reduced the batch size, and now I want to add empty_cache() after each batch. My questions:
If I add torch.cuda.empty_cache() after each batch, will it affect the ability to evaluate the model?
Will the model weights and loss keep updating if I add empty_cache()?

Calling empty_cache() frees the unused memory held by the internal caching allocator and makes it usable for other processes. It will not avoid OOM issues; it will only slow down your code, because the memory has to be re-allocated via synchronizing cudaMalloc calls in the next iteration.
Besides that, no effects on e.g. Autograd etc. will be visible.
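A minimal sketch illustrating this (the tiny model and random data here are placeholders, not the poster's actual training code): empty_cache() does not touch autograd state, so gradients, weight updates, and the loss behave exactly as they would without it.

```python
import torch
import torch.nn as nn

# Placeholder model and data for illustration only.
model = nn.Linear(4, 1)
opt = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.MSELoss()
x = torch.randn(32, 4)
y = torch.randn(32, 1)

weights_before = model.weight.detach().clone()
for step in range(3):
    opt.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    opt.step()
    if torch.cuda.is_available():
        # Returns cached, unused blocks to the driver so other processes
        # can use them; the next iteration must re-allocate them via
        # synchronizing cudaMalloc calls, which usually just slows training.
        torch.cuda.empty_cache()

# Weights still update normally - empty_cache() has no effect on autograd.
assert not torch.equal(weights_before, model.weight.detach())
```

So evaluation, weight updates, and the loss are all unaffected; the only observable difference is the extra allocation overhead per iteration.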
