About torch.cuda.empty_cache()

Well, if you run your script from a terminal, the memory is already freed by the time you start the program, so you already have it working fine :slight_smile:
You might want to check that you don’t have other programs running that use up GPU memory, though.
But otherwise, there is not much to gain here. You’ll most likely have to reduce the network size or the batch size if it does not fit in memory :confused:
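To make the point concrete, here is a minimal sketch of what `empty_cache()` does within a single process. Note that `del`-ing a tensor frees it back to PyTorch’s caching allocator, but the memory stays reserved (and shows up in `nvidia-smi`) until `empty_cache()` returns the cached blocks to the driver; it does not let you fit a bigger model. The helper function name is my own, but `memory_allocated`, `memory_reserved`, and `empty_cache` are the standard `torch.cuda` APIs:

```python
import torch

def report_gpu_memory(tag: str) -> None:
    # Illustrative helper: "allocated" is memory held by live tensors,
    # "reserved" is what the caching allocator keeps from the driver.
    allocated = torch.cuda.memory_allocated()
    reserved = torch.cuda.memory_reserved()
    print(f"{tag}: allocated={allocated} B, reserved={reserved} B")

if torch.cuda.is_available():
    x = torch.randn(1024, 1024, device="cuda")
    report_gpu_memory("after allocation")

    del x
    # The tensor is gone, so "allocated" drops, but "reserved" stays up:
    # the caching allocator keeps the block for future allocations.
    report_gpu_memory("after del")

    torch.cuda.empty_cache()
    # Now the cached (unused) blocks are released back to the driver,
    # so other programs / nvidia-smi see the memory as free.
    report_gpu_memory("after empty_cache")
```

So `empty_cache()` is mainly useful for sharing the GPU with other processes, not for squeezing a too-large model into memory.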
