Cloning a learning rate scheduler

Hi, I’m looking for a way to clone a learning rate scheduler without re-instantiating it myself. I need to apply the same schedule to a different set of parameters, but I don’t want to check the type of the existing scheduler and rebuild it manually (a sketch of that manual rebuild is at the bottom of this post). I just want to clone it. I’ve tried a few approaches below, but no luck so far. Is there a way to do this?

import torch
import torch.optim as optim
import pickle
import copy

lr = 0.1
opt = optim.SGD([torch.rand((2, 2), requires_grad=True)], lr=lr)
sch = optim.lr_scheduler.MultiStepLR(opt, milestones=[1, 4, 8], gamma=0.5)

# attempt 1: pickle round-trip
sch2 = pickle.loads(pickle.dumps(sch))
# attempt 2: deep copy
sch3 = copy.deepcopy(sch)

# attempt 3: torch.save / torch.load round-trip
torch.save(sch, '/tmp/sch.pt')
sch4 = torch.load('/tmp/sch.pt')

test = 1  # change to 1, 2, 3, or 4 to pick which scheduler to step
for epoch in range(1, 10):
    if test == 2:
        sch2.step()
    elif test == 3:
        sch3.step()
    elif test == 4:
        sch4.step()
    else:
        sch.step()

    print('Epoch-{0} lr: {1}'.format(epoch, opt.param_groups[0]['lr']))
    if epoch % 5 == 0:
        print()

This is the output with test=1:

Epoch-1 lr: 0.05
Epoch-2 lr: 0.05
Epoch-3 lr: 0.05
Epoch-4 lr: 0.025
Epoch-5 lr: 0.025

Epoch-6 lr: 0.025
Epoch-7 lr: 0.025
Epoch-8 lr: 0.0125
Epoch-9 lr: 0.0125

But test=2, 3, and 4 all yield this:

Epoch-1 lr: 0.1
Epoch-2 lr: 0.1
Epoch-3 lr: 0.1
Epoch-4 lr: 0.1
Epoch-5 lr: 0.1

Epoch-6 lr: 0.1
Epoch-7 lr: 0.1
Epoch-8 lr: 0.1
Epoch-9 lr: 0.1
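
For completeness, this is roughly the manual type check and rebuild I’m trying to avoid, continuing from the snippet above. It’s only a sketch: I’m reading milestones and gamma back off the existing scheduler (assuming MultiStepLR keeps them as attributes), and I’d have to add a branch like this for every scheduler class I use.

# Hypothetical helper: check the scheduler class and re-create it for a new
# optimizer, re-passing the constructor arguments by hand.
def rebuild_scheduler(old_sch, new_opt):
    if isinstance(old_sch, optim.lr_scheduler.MultiStepLR):
        return optim.lr_scheduler.MultiStepLR(
            new_opt, milestones=sorted(old_sch.milestones), gamma=old_sch.gamma)
    raise NotImplementedError('no rebuild rule for {}'.format(type(old_sch)))

new_opt = optim.SGD([torch.rand((2, 2), requires_grad=True)], lr=lr)
new_sch = rebuild_scheduler(sch, new_opt)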