torch.cuda.empty_cache()
When running with multiple GPUs, the program gets stuck on this line forever. While it is stuck there is no GPU usage, but there is CPU usage.
Does anyone have any idea what might be causing this?
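For context, here is a minimal sketch of the kind of multi-GPU setup where this happens. DataParallel, the model, and the tensor sizes are just illustrative placeholders, not my actual code:

```python
import torch
import torch.nn as nn

# Placeholder model; the real code uses a larger model,
# but it is replicated across all visible GPUs in a similar way.
model = nn.DataParallel(nn.Linear(1024, 1024)).cuda()

x = torch.randn(64, 1024, device="cuda")
with torch.no_grad():
    out = model(x)

del out, x
torch.cuda.empty_cache()  # <-- hangs here with multiple GPUs: GPUs idle, CPU busy
```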