After training, what happens to the weights that were dropped out?

If dropout is applied during training, I know that the dropout rate should effectively be set to 0 by calling model.eval() when evaluating. But then, are the dropped-out weights given random values at evaluation time? If so, I would expect performance to get worse. Can't we keep dropout enabled even when we evaluate?

Hi,

Since dropout is applied randomly each time it is used, there is no fixed set of weights that are dropped out. Every weight is kept on some iterations during training, so all of them are trained properly. At evaluation time, model.eval() simply disables dropout: nothing is zeroed or randomized, and all trained weights are used. PyTorch applies inverted dropout, scaling surviving activations by 1/(1-p) during training, so no rescaling is needed at eval time.
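You can see this directly with a small sketch using torch.nn.Dropout (p=0.5 chosen just for illustration):

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

drop = nn.Dropout(p=0.5)
x = torch.ones(8)

# Training mode: a random subset of activations is zeroed on each call,
# and the survivors are scaled by 1/(1-p) to keep the expected value.
drop.train()
print(drop(x))  # e.g. tensor([2., 0., 2., ...]) -- differs on every call
print(drop(x))  # a different random mask this time

# Eval mode (what model.eval() sets): dropout becomes an identity op,
# so every value passes through unchanged and nothing is randomized.
drop.eval()
print(drop(x))  # tensor([1., 1., 1., 1., 1., 1., 1., 1.])
```

In eval mode the layer is a no-op, which is why evaluation uses all the trained weights as-is rather than assigning random values.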


Hi,

Thank you so much!