Hi, I'm new to PyTorch and I was wondering: what does torch.cuda.empty_cache() do, and where should I add it? For example,
```python
for epoch in range(epochs):
    train(......)
    torch.cuda.empty_cache()
    val(....)
```
or
```python
for epoch in range(epochs):
    train(......)
    val(....)
    torch.cuda.empty_cache()
```
Hi,
This is a low-level function for finely controlling some of the internals of the caching allocator. You don't need to use it, as it will slow down your code for no gain!
Really? Many students think it is a way to speed up training.
It's not: it synchronizes your code and frees all cached memory. Since the cache is then empty, the next allocations will again synchronize your code during the underlying cudaMalloc calls, which causes slowdowns.
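To see what empty_cache() actually does, you can compare memory_allocated() (bytes held by live tensors) with memory_reserved() (bytes held by the caching allocator, including freed-but-cached blocks). A minimal sketch, assuming a machine with a CUDA device available; on a CPU-only machine the guarded branch simply doesn't run:

```python
import torch

def report(tag: str) -> None:
    # memory_allocated: bytes currently used by live tensors
    # memory_reserved: bytes the caching allocator holds from the driver
    print(f"{tag}: allocated={torch.cuda.memory_allocated()} "
          f"reserved={torch.cuda.memory_reserved()}")

if torch.cuda.is_available():
    x = torch.empty(1024, 1024, device="cuda")  # roughly 4 MB of float32
    report("after alloc")

    del x                        # tensor is freed, but the block stays cached
    report("after del")          # allocated drops, reserved stays the same

    torch.cuda.empty_cache()     # cached blocks are returned to the driver
    report("after empty_cache")  # reserved drops back down
```

Note that the "after del" line is exactly why the cache exists: the next allocation of a similar size can reuse the cached block without a cudaMalloc call, which is why routinely calling empty_cache() in a training loop only makes things slower.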