Limit Cores used during training of word2vec Model

Dear community,

I have access to a machine with 64 cores. On this machine I am supposed to train my model using JupyterLab.

The problem is that I am only allowed to use 8 of the 64 available cores, so as not to disturb the other people I am sharing the machine with.

May I ask what the go-to, best-practice approach would be to limit the number of cores used?

As far as I understand, a `torch.multiprocessing` Pool would let me set the number of worker processes. But is there a better way?
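To make the question concrete, here is a sketch of one approach I am considering (my own assumption, not something I was told to use): pinning the whole process to a fixed set of cores with `os.sched_setaffinity`, which is Linux-only. Everything the notebook kernel spawns, including BLAS/OpenMP thread pools, would then be confined to those cores.

```python
import os

# Linux-only sketch: restrict the current process (pid 0 = self) to at
# most 8 CPU cores. Child processes and library thread pools inherit
# this affinity mask.
n = min(8, os.cpu_count())        # guard for machines with fewer cores
os.sched_setaffinity(0, range(n)) # pin to cores 0..n-1

# A complementary option: cap library thread pools via environment
# variables. These must be set before numpy/torch are first imported.
os.environ["OMP_NUM_THREADS"] = "8"
os.environ["MKL_NUM_THREADS"] = "8"

print(len(os.sched_getaffinity(0)))  # effective core count, at most 8
```

Is something like this reasonable, or is there a cleaner way that also covers the `torch.multiprocessing` workers?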

I would love to get your input.

Yours,

Tridelt