Larger batch size, less training time per epoch?

A larger batch size should mean less time to complete an epoch. Is this right?
But in practice, when I set the batch size to 1, an epoch takes 108 seconds; with a batch size of 32 it takes 116 s, and with a batch size of 64 it takes 142 s.
I am very confused by these results — why doesn't a larger batch size reduce the epoch time?
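To make my expectation concrete: an epoch always covers the same number of samples, so a larger batch size only reduces the number of iterations per epoch, and I assumed fewer iterations means less time. A quick sketch of that arithmetic (the dataset size of 10,000 is a made-up number for illustration, not my actual dataset):

```python
import math

# An epoch covers a fixed number of samples; a larger batch size
# only changes how many iterations are needed to cover them.
N = 10_000  # assumed dataset size, for illustration only

def iters_per_epoch(batch_size, n=N):
    """Number of training iterations needed to see all n samples once."""
    return math.ceil(n / batch_size)

for bs in (1, 32, 64):
    print(f"batch_size={bs:3d} -> {iters_per_epoch(bs)} iterations per epoch")
# batch_size=  1 -> 10000 iterations per epoch
# batch_size= 32 -> 313 iterations per epoch
# batch_size= 64 -> 157 iterations per epoch
```

Fewer iterations does not automatically mean a shorter epoch, though: the total number of samples processed is the same, and each larger batch takes longer per iteration, so the epoch only gets faster if the per-iteration overhead (data loading, kernel launches, etc.) was the bottleneck.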