What learning rates does the ExponentialLR scheduler produce?

Looking at the docs for

torch.optim.lr_scheduler.ExponentialLR(optimizer, gamma, last_epoch=-1)

I see the following get_lr method:

def get_lr(self):
    return [base_lr * self.gamma ** self.last_epoch
            for base_lr in self.base_lrs]
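
If I read base_lrs as the initial learning rates captured once when the scheduler is constructed (I am not sure that reading is right), the formula would just evaluate to base_lr * gamma ** n at step n, e.g.:

base_lr = 1e-3
gamma = 0.999

# Evaluate the quoted formula directly, holding base_lr fixed.
for last_epoch in range(10):
    print('{:05d}'.format(last_epoch + 1), ' ',
          '{:10.8f}'.format(base_lr * gamma ** last_epoch))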

Does this mean that the ExponentialLR scheduler produces learning rates as in the following simulation, where each epoch I multiply the running lr by gamma ** last_epoch:

last_epoch = 0
lr = 1e-3
gamma = 0.999

for i in range(10):
    # Multiply the running lr by gamma ** last_epoch, so the decay compounds.
    lr = lr * gamma ** last_epoch
    print('{:05d}'.format(i + 1), ' ', '{:10.8f}'.format(lr))
    last_epoch += 1

which produces the following output:

00001   0.00100000
00002   0.00099900
00003   0.00099700
00004   0.00099401
00005   0.00099004
00006   0.00098510
00007   0.00097921
00008   0.00097237
00009   0.00096462
00010   0.00095598

Am I understanding the ExponentialLR scheduler correctly?
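
For completeness, this is the minimal check I could run against the actual scheduler to compare (the single dummy parameter and SGD optimizer are just placeholders to give the scheduler something to act on):

import torch

# Dummy parameter/optimizer, only there so the scheduler has
# real param_groups to update.
param = torch.nn.Parameter(torch.zeros(1))
optimizer = torch.optim.SGD([param], lr=1e-3)
scheduler = torch.optim.lr_scheduler.ExponentialLR(optimizer, gamma=0.999)

for epoch in range(10):
    # get_last_lr() reports the lr most recently set by the scheduler
    # (optimizer.param_groups[0]['lr'] shows the same value).
    print('{:05d}'.format(epoch + 1), ' ',
          '{:10.8f}'.format(scheduler.get_last_lr()[0]))
    optimizer.step()     # optimizer first, then scheduler
    scheduler.step()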

Thank you.