How to use the Adam and LBFGS optimizers in a single neural network

Dear community,

I want to use two optimizers for my model: first Adam, then LBFGS. This is my current code for training the network:

import numpy as np
import torch

# FCN, device, and the training data (left_x, idx_l, ...) are defined earlier in my script
layers = np.array([2, 20, 20, 1])   # 2 inputs, two hidden layers of 20 units, 1 output
PINN = FCN(layers).to(device)

optimizer = torch.optim.LBFGS(PINN.parameters(), max_iter=20, lr=0.001)

def closure():
    # LBFGS re-evaluates the loss and its gradients through this closure
    optimizer.zero_grad()
    loss = PINN.loss(left_x[idx_l,:], left_y[idx_l,:], right_x[idx_r,:], right_y[idx_r,:],
                     bottom_x[idx_b,:], bottom_y[idx_b,:], X_train_Nf)
    loss.backward()
    return loss

for i in range(1000):
    loss = optimizer.step(closure)
    with torch.no_grad():
        test_loss_l = PINN.lossBC_l(left_x, left_y.flatten().view(-1,1))
        if (i+1) % 250 == 0:
            print('training:', loss.cpu().detach().numpy(), '/ Testing', test_loss_l.cpu().detach().numpy())

How can I modify this so that it first trains with Adam and then switches to LBFGS?
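
I was thinking of something along these lines, where Adam runs for a fixed number of iterations first and LBFGS then continues on the same parameters (the iteration counts, learning rates, and the line_search_fn setting below are just placeholders I picked, not values I have tested), but I am not sure this is the right way to combine the two optimizers:

# Phase 1: Adam for a fixed number of iterations (5000 is just a guess)
adam = torch.optim.Adam(PINN.parameters(), lr=1e-3)
for i in range(5000):
    adam.zero_grad()
    loss = PINN.loss(left_x[idx_l,:], left_y[idx_l,:], right_x[idx_r,:], right_y[idx_r,:],
                     bottom_x[idx_b,:], bottom_y[idx_b,:], X_train_Nf)
    loss.backward()
    adam.step()

# Phase 2: switch to LBFGS on the same parameters to refine the solution
lbfgs = torch.optim.LBFGS(PINN.parameters(), max_iter=20, lr=0.1,
                          line_search_fn='strong_wolfe')

def closure():
    lbfgs.zero_grad()
    loss = PINN.loss(left_x[idx_l,:], left_y[idx_l,:], right_x[idx_r,:], right_y[idx_r,:],
                     bottom_x[idx_b,:], bottom_y[idx_b,:], X_train_Nf)
    loss.backward()
    return loss

for i in range(1000):
    lbfgs.step(closure)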
I would very much appreciate any help.

Cheers
Ali