Model inference time measurement does not add up

I want to measure my model's inference time with this code:

    import time
    import numpy as np
    import torch

    # net is the model under test (defined elsewhere)
    with torch.no_grad():
        input = np.random.rand(1, 3, 368, 368).astype(np.float32)
        input = torch.from_numpy(input)
        start = time.time()
        for i in range(100):
            t1 = time.clock()
            _, _ = net(input)
            t2 = time.clock()
            print('every_time: %04d: ' % i, t2 - t1)
        end = time.time()
        print('total time: ', end - start)

the print result is:

    every_time: 0000:  0.37265799999999993
    every_time: 0001:  0.32706800000000014
    ...
    every_time: 0098:  0.32011200000000173
    every_time: 0099:  0.3260919999999956

    total time:  8.159255981445312

Each every_time is about 0.3~0.4 s, so the total should be roughly (0.3~0.4)*100 = 30~40 s, but the reported total time is about 8.16 s. Actually, in my opinion, 8.16 s is the correct elapsed time. So why doesn't every_time summed over the 100 iterations equal the total time?

You are not measuring the same thing: on Linux, `time.clock()` returns CPU time used by the process (summed across all threads), while `time.time()` returns wall-clock time. PyTorch runs inference on several threads, so the per-iteration CPU time is larger than the elapsed wall time; 8.16 s is the real elapsed time. Also note that `time.clock()` was deprecated in Python 3.3 and removed in 3.8. See https://stackoverflow.com/questions/85451/pythons-time-clock-vs-time-time-accuracy
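
For reference, here is a minimal sketch of the same measurement using `time.perf_counter()` (wall-clock time) and `time.process_time()` (CPU time). The single-layer model and the input shape are placeholders standing in for your actual `net`, not your code:

    import time
    import numpy as np
    import torch
    import torch.nn as nn

    # Placeholder model standing in for the real `net`
    net = nn.Conv2d(3, 16, kernel_size=3, padding=1)
    net.eval()

    x = torch.from_numpy(np.random.rand(1, 3, 368, 368).astype(np.float32))

    with torch.no_grad():
        wall_start = time.perf_counter()   # wall-clock time
        cpu_start = time.process_time()    # CPU time, summed over all threads
        for _ in range(100):
            _ = net(x)
        wall_elapsed = time.perf_counter() - wall_start
        cpu_elapsed = time.process_time() - cpu_start

    print('wall time: %.3f s' % wall_elapsed)  # comparable to your time.time() total
    print('CPU time:  %.3f s' % cpu_elapsed)   # larger when PyTorch uses several threads

With several intra-op threads, the CPU-time figure will be a multiple of the wall-time figure, which is exactly the gap you are seeing.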