Can I train a model with dropout and apply the checkpoint to a model without the dropout layers?

I understand that after training a model with dropout layers, you set it to net.eval() for inference.

But I thought there should be a way to load the learned weights (that is, the checkpoint) from training into another model that is identical to the trained one except that it has no dropout layers.

If anyone could let me know, I would appreciate it.
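For concreteness, here is roughly what I have in mind (the two model classes are just an illustrative sketch, not my actual architecture). Since nn.Dropout has no learnable parameters, I would expect the state_dict keys to match as long as the learnable layers keep the same attribute names:

```python
import torch
import torch.nn as nn

class NetWithDropout(nn.Module):
    """Model used for training (hypothetical example architecture)."""
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(784, 256)
        self.drop = nn.Dropout(p=0.5)
        self.fc2 = nn.Linear(256, 10)

    def forward(self, x):
        x = torch.relu(self.fc1(x))
        x = self.drop(x)
        return self.fc2(x)

class NetWithoutDropout(nn.Module):
    """Same layers and attribute names, but no dropout layer."""
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(784, 256)
        self.fc2 = nn.Linear(256, 10)

    def forward(self, x):
        x = torch.relu(self.fc1(x))
        return self.fc2(x)

# Train NetWithDropout and save its checkpoint ...
trained = NetWithDropout()
torch.save(trained.state_dict(), "checkpoint.pth")

# ... then load the same weights into the dropout-free model.
# nn.Dropout contributes no entries to the state_dict, so the keys line up.
inference_net = NetWithoutDropout()
inference_net.load_state_dict(torch.load("checkpoint.pth"))
inference_net.eval()
```

(If the layers were wrapped in an nn.Sequential instead, removing the dropout would shift the numeric keys, so the names would no longer line up and the keys would need remapping before loading.)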

Do you want to disable dropout for each of the training checkpoints? Then just call net.eval() on the checkpointed model, and call net.train() at the beginning of each step of your training loop (to re-enable dropout). A minimal sketch of that pattern is below.
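A minimal sketch of that pattern, with a placeholder model, data, and loop just to make it concrete:

```python
import torch
import torch.nn as nn

# Placeholder model and data to illustrate the train()/eval() toggling.
model = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Dropout(0.5), nn.Linear(32, 2))
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
criterion = nn.CrossEntropyLoss()
inputs = torch.randn(64, 10)
targets = torch.randint(0, 2, (64,))

for epoch in range(3):
    model.train()                      # restore dropout for the training step
    optimizer.zero_grad()
    loss = criterion(model(inputs), targets)
    loss.backward()
    optimizer.step()

    model.eval()                       # dropout disabled for the checkpoint / inference
    with torch.no_grad():
        val_loss = criterion(model(inputs), targets)
    torch.save(model.state_dict(), f"checkpoint_epoch_{epoch}.pth")
```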
