LambdaLR scheduler steps once when it is instantiated

The following minimal example prints the optimizer's learning rate immediately before and after constructing a LambdaLR scheduler.

import torch
import torch.nn as nn

model = nn.Linear(2, 3)
opt = torch.optim.Adam(model.parameters(), lr=1.0)
print("Before instantiation of lr scheduler", opt.param_groups[0]['lr'])

lrs = torch.optim.lr_scheduler.LambdaLR(opt, lr_lambda=lambda epoch: (epoch + 1) * 0.5)
print("After instantiation of lr scheduler", opt.param_groups[0]['lr'])
print("After instantiation of lr scheduler, accessed through the method in lr_scheduler", lrs.get_last_lr())

The results I obtain are:

Before instantiation of lr scheduler 1.0
After instantiation of lr scheduler 0.5
After instantiation of lr scheduler, accessed through the method in lr_scheduler [0.5]

I am wondering why the learning rate was updated even before any call to the scheduler's step() method.
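
For what it's worth, the docs say LambdaLR sets the learning rate of each parameter group to the initial lr times the given function, and as far as I can tell the scheduler base class runs one implicit step() inside __init__, advancing last_epoch from -1 to 0. Below is a minimal sketch of that arithmetic, assuming this initial-step behavior (base_lr = 1.0 here):

import torch
import torch.nn as nn

model = nn.Linear(2, 3)
opt = torch.optim.Adam(model.parameters(), lr=1.0)
lr_lambda = lambda epoch: (epoch + 1) * 0.5
base_lr = 1.0

# Value written at construction time, assuming an implicit initial
# step() evaluates the lambda at last_epoch = 0:
print(base_lr * lr_lambda(0))  # 0.5, matching the output above

lrs = torch.optim.lr_scheduler.LambdaLR(opt, lr_lambda=lr_lambda)
for epoch in range(1, 4):
    opt.step()  # optimizer step first, to avoid the ordering warning
    lrs.step()  # advances last_epoch; lr = base_lr * lr_lambda(last_epoch)
    print(epoch, opt.param_groups[0]['lr'], base_lr * lr_lambda(epoch))

Under that assumption, the 0.5 I observe right after construction is just base_lr * lr_lambda(0), and each explicit step() then re-evaluates the lambda at the next epoch (1.0, 1.5, 2.0, ...).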