Thanks for your reply.
Last night I used the model to train. I didn't get this error, but the test loss does not go down.
And when I print scheduler.get_lr(), I get a [value].
If your test loss doesn't go down, that's a completely different situation. You're probably overfitting at that point. You can confirm that by checking performance against your evaluation metrics, and then maybe use early stopping or something along those lines.
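In case it helps, here is a minimal early-stopping sketch in plain Python. The `EarlyStopping` class and its parameter names are illustrative, not from any particular library; frameworks like PyTorch Lightning or Keras ship their own versions.

```python
class EarlyStopping:
    """Stop training when the validation loss has not improved for `patience` epochs."""

    def __init__(self, patience=3, min_delta=0.0):
        self.patience = patience    # how many non-improving epochs to tolerate
        self.min_delta = min_delta  # minimum decrease that counts as an improvement
        self.best_loss = float("inf")
        self.counter = 0

    def step(self, val_loss):
        """Record this epoch's validation loss; return True when training should stop."""
        if val_loss < self.best_loss - self.min_delta:
            self.best_loss = val_loss
            self.counter = 0
        else:
            self.counter += 1
        return self.counter >= self.patience


# Example: validation loss stalls after epoch 2, so training stops.
stopper = EarlyStopping(patience=2)
val_losses = [0.9, 0.7, 0.6, 0.61, 0.62, 0.63]
for epoch, loss in enumerate(val_losses):
    if stopper.step(loss):
        print(f"stopping at epoch {epoch}")  # stops at epoch 4
        break
```

The idea is just to track the best validation loss seen so far and stop once it hasn't improved for `patience` consecutive epochs, which caps the overfitting you'd see if the test loss flattens while the training loss keeps dropping.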
Thanks for your reply. I have done it.