Accessing the current epoch in a learning rate scheduler

I want to customize the learning rate schedule for a 20-epoch training run: a constant learning rate for the first 10 epochs and an exponentially decreasing rate for the last 10. I want something like the code below inside "get_scheduler(optimizer, opt)".

    if epoch < 11:
        # constant learning rate for the first 10 epochs
        scheduler = lr_scheduler.LambdaLR(optimizer, lr_lambda=lambda epoch: 1.0)
    else:
        # exponentially decreasing learning rate for the last 10 epochs
        scheduler = lr_scheduler.ExponentialLR(optimizer, gamma=0.9)
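
In other words, I think the whole schedule could be expressed as a single LambdaLR. Here is a rough sketch of what I mean (my assumption, using 0-based epoch indices and the same gamma of 0.9):

    from torch.optim import lr_scheduler

    def lambda_rule(epoch):
        # factor 1.0 keeps the learning rate constant for the first 10 epochs
        # (indices 0-9); afterwards the factor decays exponentially with base 0.9
        if epoch < 10:
            return 1.0
        return 0.9 ** (epoch - 9)

    scheduler = lr_scheduler.LambdaLR(optimizer, lr_lambda=lambda_rule)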

Inside "get_scheduler(optimizer, opt)" I don't have access to the current epoch, but when I pass the function "def lambda_rule(epoch)" to "scheduler = lr_scheduler.LambdaLR(optimizer, lr_lambda=lambda_rule)", I do have access to the current epoch, as in the code below.

    def lambda_rule(epoch):
        # multiplicative factor applied to the initial learning rate
        lr_l = 1.0
        return lr_l

    scheduler = lr_scheduler.LambdaLR(optimizer, lr_lambda=lambda_rule)

scheduler.step() is called after every epoch. How can I access the current epoch inside "get_scheduler"? And why do I have access to the current epoch in lambda_rule(epoch) even though I am not passing "epoch" in the call?
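
For reference, the training loop is essentially the following (a simplified, self-contained sketch; my real model and data handling are omitted):

    import torch
    from torch import optim
    from torch.optim import lr_scheduler

    # dummy parameter so the snippet runs on its own
    params = [torch.nn.Parameter(torch.zeros(1))]
    optimizer = optim.SGD(params, lr=0.1)

    def lambda_rule(epoch):
        print("lambda_rule called with epoch =", epoch)  # prints the counter LambdaLR passes in
        return 1.0

    scheduler = lr_scheduler.LambdaLR(optimizer, lr_lambda=lambda_rule)

    for epoch in range(20):
        # ... one epoch of training ...
        optimizer.step()
        scheduler.step()  # lambda_rule is then called with the next epoch index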

The second issue: when lr_scheduler.LambdaLR(optimizer, lr_lambda=lambda_rule) is called, lambda_rule can return the float value 1.0 and it works, but when I call lr_scheduler.ExponentialLR(optimizer, gamma=lambda_rule) and return 0.9, it raises the error below.

    TypeError: unsupported operand type(s) for *: 'float' and 'function'
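
A minimal snippet that reproduces the error (simplified; depending on the PyTorch version it surfaces at construction or at the first step() call):

    import torch
    from torch import optim
    from torch.optim import lr_scheduler

    params = [torch.nn.Parameter(torch.zeros(1))]
    optimizer = optim.SGD(params, lr=0.1)

    def lambda_rule(epoch):
        return 0.9

    # gamma is expected to be a plain float; passing a function means the
    # scheduler later computes lr * gamma with a function as the right operand
    scheduler = lr_scheduler.ExponentialLR(optimizer, gamma=lambda_rule)
    scheduler.step()  # TypeError: unsupported operand type(s) for *: 'float' and 'function'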