Hi, I was wondering what the difference is between nn.Dropout and nn.Dropout1d. Which one should I use for normal Linear layers? Thank you
nn.Dropout1d will zero out entire "channels", i.e. values in dim1. These activations are usually provided by nn.Conv1d modules (or other "temporal" layers).
For standard nn.Linear layers working on an input in the shape [batch_size, in_features] (i.e. without a "temporal" dimension), nn.Dropout would be the common choice.
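A small sketch illustrating the difference (the shapes and p=0.5 are just example values):

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# nn.Dropout1d expects a "channeled" input, e.g. [batch_size, channels, length],
# and drops entire channels: each row in dim1 is either all zeros or all
# scaled by 1/(1-p).
x = torch.ones(1, 4, 6)
out1d = nn.Dropout1d(p=0.5)(x)
print(out1d)  # whole channels are zeroed together

# nn.Dropout works elementwise, so it fits the [batch_size, in_features]
# output of an nn.Linear layer: each value is dropped independently.
feat = torch.ones(1, 8)
out = nn.Dropout(p=0.5)(feat)
print(out)  # individual values are zeroed
```

During eval mode (`model.eval()`) both modules become no-ops, so the difference only matters during training.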