How to limit the number of CPUs used by PyTorch?

I am running my training on a server which has 56 CPU cores. When I train a network, PyTorch uses almost all of them.

I want to limit PyTorch to only 8 cores (say). How can I do this?


You can use torch.set_num_threads() to do this. It sets the number of threads used for intra-op parallelism (e.g. parallelizing a single matrix multiply). Note that it does not affect data-loader worker processes (num_workers in DataLoader) or GPU work.
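A minimal sketch: besides torch.set_num_threads(), PyTorch also exposes torch.set_num_interop_threads() for inter-op parallelism (running independent ops concurrently). The latter must be called early, before any parallel work has started.

```python
import torch

# Limit intra-op parallelism (threads used inside a single op,
# e.g. a large matrix multiplication) to 8 threads.
torch.set_num_threads(8)

# Optionally also limit inter-op parallelism (concurrent execution of
# independent ops). This must be called before any parallel work runs,
# ideally right after importing torch.
torch.set_num_interop_threads(8)

print(torch.get_num_threads())          # 8
print(torch.get_num_interop_threads())  # 8
```

Alternatively, you can cap the underlying OpenMP/MKL thread pools by setting the OMP_NUM_THREADS and MKL_NUM_THREADS environment variables before launching Python.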