I am running my training on a server that has 56 CPU cores. When I train a network, PyTorch starts using almost all of them.
I want to limit PyTorch to only 8 cores (say). How can I do this?
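
In case it helps, here is a minimal sketch of the kind of control I am hoping for, assuming `torch.set_num_threads` and `torch.set_num_interop_threads` are the right knobs (the `8` is just my target core count):

```python
import torch

# Cap intra-op parallelism (threads used inside individual ops
# such as matmul and conv) to 8 threads.
torch.set_num_threads(8)

# Cap inter-op parallelism (PyTorch's task pool for running
# independent ops concurrently); this must be called before any
# inter-op parallel work has started.
torch.set_num_interop_threads(8)
```

I have also seen suggestions to set environment variables such as `OMP_NUM_THREADS=8` and `MKL_NUM_THREADS=8` before launching the script, but I am not sure which of these actually constrains PyTorch, or whether they need to be combined.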