Problem with TensorBoard epochs

I have a for loop, and in every epoch I add some scalars with writer.add_scalar. But when I open TensorBoard, the number of epochs drawn does not match the number of loop epochs: for example, in a loop of 100 epochs, it draws only 90. add_scalar runs in every epoch (I printed the epoch number to verify), so I have absolutely no idea what I should check to find the problem.
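For reference, here is a minimal sketch of the setup described above, assuming PyTorch's `torch.utils.tensorboard.SummaryWriter`; `compute_loss` is a hypothetical stand-in for the real training step. Note that `SummaryWriter` buffers events in memory and only flushes periodically, so a missing `flush()`/`close()` at the end of training is one common way the last few epochs can fail to appear in the plot:

```python
from torch.utils.tensorboard import SummaryWriter

writer = SummaryWriter(log_dir="runs/example")  # log directory is an assumption

def compute_loss(epoch):
    # hypothetical placeholder for a real training step
    return 1.0 / (epoch + 1)

num_epochs = 100
for epoch in range(num_epochs):
    loss = compute_loss(epoch)
    # pass the epoch explicitly as global_step so each point gets a
    # distinct x-coordinate in TensorBoard
    writer.add_scalar("train/loss", loss, global_step=epoch)

# SummaryWriter buffers events; without close() (or flush()), buffered
# events may never reach disk, making the tail of the run appear missing
writer.close()
```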