Performance changes after loading the saved model

Hi, I'm currently using the BinaryNet implementation provided here. In particular, I'm working with the vgg_cifar10_binary model on the CIFAR-10 dataset. I modified the code to have two instances of the model, one for training and one for testing, like this:

model = models.__dict__[args.model]      # look up the model constructor by name
model_val = models.__dict__[args.model]
model_config = {'input_size': None, 'dataset': 'cifar10'}
model = model(**model_config)            # instance used for training
model_val = model_val(**model_config)    # separate instance used only for validation
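
As far as I understand, the two constructor calls give independently initialized instances of the same architecture, so their weights differ until I explicitly copy them over. A quick sketch of what I mean (standard nn.Module API; the comparison is just for illustration):

import torch

# hypothetical check: compare one parameter tensor from each freshly built instance
p_train = next(model.parameters())
p_val = next(model_val.parameters())
print(torch.equal(p_train, p_val))  # expected to be False right after construction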

In each epoch, I copy the parameters of the training model over to the testing model by saving and reloading the state dict:

for epoch in range(args.start_epoch, args.epochs):
    optimizer = adjust_optimizer(optimizer, epoch, regime)
    train_loss, train_prec1, train_prec5 = train(
            train_loader, model, criterion, epoch, optimizer)
    # copy the trained parameters into the validation model via a checkpoint file
    torch.save(model.state_dict(), path)
    model_val.load_state_dict(torch.load(path))
    with torch.no_grad():
        val_loss, val_prec1, val_prec5 = validate(val_loader, model_val, criterion, epoch)
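
For reference, this is the kind of sanity check I would add right after load_state_dict to confirm the copy itself succeeded (the helper is written just for this post, assuming both models are ordinary nn.Modules):

import torch

def state_dicts_equal(m_a, m_b):
    """True if both models have identical state-dict keys and identical tensors."""
    sd_a, sd_b = m_a.state_dict(), m_b.state_dict()
    if sd_a.keys() != sd_b.keys():
        return False
    return all(torch.equal(sd_a[k], sd_b[k]) for k in sd_a)

# expected to print True right after model_val.load_state_dict(...)
print(state_dicts_equal(model, model_val))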

Other parts of the code remain unchanged. I expected this to perform the same as using a single model, but it doesn't: validation gives correct results in the first epoch, then accuracy drops drastically from the second epoch onward (changing model_val back to model in the validate call makes the network behave properly again).

Did I do anything wrong with the parameter saving and loading?
Thanks!