UnboundLocalError: local variable 'values' referenced before assignment in lr_scheduler

I can't tell which part of my code triggers this error. Here is the relevant section:

    model.eval()
    metric, metric2, valid_loss = evalue(model, valid_dl)
    if metric2 > best_score:
        state = {'state': model.state_dict(), 'best_score': metric2}
        torch.save(state, checkpoint_path)
        best_score = metric2
    logging.basicConfig(filename='cloud4.log', level=logging.DEBUG, format='%(asctime)s-%(message)s')
    logging.warning('epoch_loss:{}, metric1:{}, metric2:{}'.format(epoch_loss, metric, metric2))
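For context, the checkpoint logic in the snippet follows a common "save on best validation score" pattern. Below is a minimal, self-contained sketch of that pattern using only the standard library (pickle stands in for torch.save, and evaluate is a hypothetical stand-in for my evalue(model, valid_dl) function); it is an illustration of the intended control flow, not my actual training code:

```python
import os
import pickle
import tempfile

def evaluate(model_state):
    # Hypothetical stand-in for evalue(model, valid_dl):
    # returns (metric, metric2, valid_loss).
    return 0.5, model_state["acc"], 0.3

best_score = 0.0
checkpoint_path = os.path.join(tempfile.gettempdir(), "best.ckpt")

# Simulated per-epoch validation accuracies.
for epoch, acc in enumerate([0.4, 0.7, 0.6]):
    model_state = {"acc": acc}
    metric, metric2, valid_loss = evaluate(model_state)
    if metric2 > best_score:
        # Save a checkpoint only when the validation metric improves.
        state = {"state": model_state, "best_score": metric2}
        with open(checkpoint_path, "wb") as f:
            pickle.dump(state, f)
        best_score = metric2

# best_score is now 0.7 (the best metric2 seen across epochs)
```

Note that logging.basicConfig only configures the root logger on its first call; calling it inside the epoch loop, as in my snippet, has no effect after the first epoch.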