Using the same dropout object for multiple dropout layers?

Excuse me for jumping into the conversation, but this was one of the bits of style advice in a tutorial I did earlier this week.

While it would technically work for vanilla PyTorch use, I would consider re-using layers bad advice. This includes ReLU and Dropout.

My style advice is to use the functional interface when you don’t want state, and to instantiate one object per use-site if you do.
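For example, here is a minimal sketch of the two styles side by side (the layer sizes and dropout probability are made up for illustration):

```python
import torch
from torch import nn
import torch.nn.functional as F

# Stateless style: call the functional interface directly in forward().
class NetFunctional(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(784, 256)
        self.fc2 = nn.Linear(256, 10)

    def forward(self, x):
        x = F.relu(self.fc1(x))
        # Note: F.dropout needs self.training passed explicitly,
        # unlike the nn.Dropout module, which tracks it for you.
        x = F.dropout(x, p=0.5, training=self.training)
        return self.fc2(x)

# Module style: one dedicated instance per use-site.
class NetModular(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(784, 256)
        self.relu1 = nn.ReLU()
        self.drop1 = nn.Dropout(p=0.5)
        self.fc2 = nn.Linear(256, 10)

    def forward(self, x):
        return self.fc2(self.drop1(self.relu1(self.fc1(x))))
```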

The reason for this is that re-using a single instance causes more confusion than benefit:

  • It looks odd when printing the model (see the sketch after this list).
  • It may confuse other analysis tools.
  • When you do advanced things, e.g. quantization, ReLU suddenly becomes stateful because it captures data for quantization.
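To illustrate the printing point, here is a small made-up module that re-uses one ReLU at two call sites:

```python
import torch
from torch import nn

class Reused(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(4, 4)
        self.fc2 = nn.Linear(4, 4)
        self.relu = nn.ReLU()  # one instance, applied twice below

    def forward(self, x):
        return self.relu(self.fc2(self.relu(self.fc1(x))))

print(Reused())
# Reused(
#   (fc1): Linear(in_features=4, out_features=4, bias=True)
#   (fc2): Linear(in_features=4, out_features=4, bias=True)
#   (relu): ReLU()
# )
```

The printout shows a single ReLU, so the module structure no longer reflects the two activation sites in forward(), and anything that attaches per-module state (hooks, quantization observers) will conflate the two call sites.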

Best regards

Thomas
