How to compute the amount of data transferred and the time taken during training?

In federated learning, I want to measure the communication time for transferring data from the clients to the server during training. How can I compute the amount of data transferred (in bytes) and the time it takes in each round?
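For reference, this is roughly how I imagine measuring it myself: serialise each client model's state_dict to count the bytes and time the whole round with time.time(). This is only a sketch under my own assumptions (the state_dict_size_bytes helper below is something I made up, not part of any FL framework, and I am not sure the serialised state_dict really matches what goes over the wire):

    import io
    import time
    import torch

    def state_dict_size_bytes(model):
        # Serialise the state_dict to an in-memory buffer and count its bytes.
        # Assumption: the state_dict is the payload each client sends to the server.
        buffer = io.BytesIO()
        torch.save(model.state_dict(), buffer)
        return buffer.getbuffer().nbytes

    round_start = time.time()
    round_bytes = 0
    loss = 0.0
    for i in range(num_selected):
        loss += client_update(client_models[i], opt[i], train_loader[client_idx[i]], epoch=epochs)
        round_bytes += state_dict_size_bytes(client_models[i])
    round_time = time.time() - round_start
    # Note: round_time is the wall time of the whole round (local training included),
    # not the pure communication time, and round_bytes is the serialised model size,
    # which may differ from the real network traffic.
    print(f"round payload: {round_bytes / 1e6:.2f} MB | round wall time: {round_time:.2f} s")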

I use tqdm() (documentation here: tqdm · PyPI) to show a progress bar with a time and a unit, like this:

    for i in tqdm(range(num_selected), unit='B', unit_divisor=1000, unit_scale=True):
        loss += client_update(client_models[i], opt[i], train_loader[client_idx[i]], epoch=epochs)

The output I get is:

100%|██████████| 30.0/30.0 [00:45<00:00, 1.50s/B]
0-th round
average train loss 1.78 | test loss 0.0279 | test acc: 0.278
100%|██████████| 30.0/30.0 [00:44<00:00, 1.48s/B]
1-th round

It is not clear to me whether this means 1.50 bytes are transferred every second, or one byte is transferred every 1.50 seconds.

Even when I change it to unit='kB', the output stays the same except that the letter B becomes kB, with exactly the same numbers!

How can I use tqdm() to show the real number of bytes transferred in each round?
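The closest I have managed is to drive the bar manually with a byte total, as in the sketch below (it reuses the state_dict_size_bytes helper from above, so the counts are serialised model sizes rather than measured network traffic), but I am not sure this is the intended way to do it:

    from tqdm import tqdm

    # Assumption: client_models is a list and its first num_selected entries
    # are the clients taking part in this round.
    total_bytes = sum(state_dict_size_bytes(m) for m in client_models[:num_selected])

    loss = 0.0
    with tqdm(total=total_bytes, unit='B', unit_scale=True, unit_divisor=1000) as pbar:
        for i in range(num_selected):
            loss += client_update(client_models[i], opt[i], train_loader[client_idx[i]], epoch=epochs)
            # Advance the bar by the bytes of this client's update so the displayed
            # rate is bytes per second instead of seconds per iteration.
            pbar.update(state_dict_size_bytes(client_models[i]))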