Hi all.
A strange thing happened.
I am using PyTorch on a shared PC, and during training the CPU usage was very high and monopolized the machine's resources.
So I limited the number of CPU cores used (72 → 8), and CPU utilization dropped as expected. Strangely, GPU utilization became higher, and training time was also greatly reduced.
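For reference, here is a minimal sketch of how I capped the thread count (the value `8` is just my setting). The environment variables must be set before PyTorch or NumPy is imported, since the OpenMP/MKL thread pools read them at load time; `torch.set_num_threads` is the in-process alternative.

```python
import os

# Cap the thread pools BEFORE importing torch/numpy:
# these variables are read once, when the libraries load.
N_THREADS = "8"
os.environ["OMP_NUM_THREADS"] = N_THREADS
os.environ["MKL_NUM_THREADS"] = N_THREADS

# Once torch is imported, the same limit can be set in-process:
#   import torch
#   torch.set_num_threads(8)           # intra-op parallelism
#   torch.set_num_interop_threads(8)   # inter-op parallelism

print(os.environ["OMP_NUM_THREADS"])
```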
This contradicts my expectation that more CPU cores would always mean better performance.
What could be the cause of this phenomenon?
Best regards.