Is it normal that `tanh` doesn't use all cores?

When I run this script:

import torch

a = torch.rand(1000, 10000)
while True:
    print('.')
    a.tanh_()

and then open htop, I expect to see all 8 cores running at 100%, but only 4 of them seem to be busy.

Thoughts?

[edit: also, the 4 cores that are running are not exactly running at 100% either]
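
In case it helps, this is roughly how I'd check (and try to override) the number of intra-op threads PyTorch is using; torch.get_num_threads() / torch.set_num_threads() are the standard calls, but the value 8 is just an assumption about my core count:

import torch

# Report how many intra-op threads PyTorch is currently configured to use.
print(torch.get_num_threads())

# Try forcing it to the machine's core count (8 is an assumption here),
# then re-run the tanh loop and watch htop again.
torch.set_num_threads(8)
print(torch.get_num_threads())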

Is it always cores 1, 3, 5, 7 that are active, or does it change over time?

It is possible that your ‘a’ is allocated across 4 of the cores, so the function processes the loop on a first core (for the first part of ‘a’), then on a second, and so on. Each time, the corresponding core is active at 100%, but only for a short time. If htop is showing a moving average of the activity, that would explain why you see 4 cores each working at about 1/4.
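
A quick way to test whether the extra cores would actually help, rather than reading it off htop's averaging, would be to time the op under different thread counts. This is only a rough sketch: torch.set_num_threads() is a real knob, but the thread counts and the iteration count below are arbitrary choices.

import time
import torch

a = torch.rand(1000, 10000)

for n in (1, 4, 8):  # candidate thread counts; adjust to your core count
    # Changing the thread count mid-process may not take effect with every
    # build/backend, so treat the numbers as indicative only.
    torch.set_num_threads(n)
    start = time.perf_counter()
    for _ in range(100):
        a.tanh_()
    elapsed = time.perf_counter() - start
    print(f'{n} threads: {elapsed:.3f} s')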