Online hard negative mining vs dropout

(Lolong) #1

I'm asking for a confirmation and for best practices in the online hard negative mining problem.

For online mining, as you know, you have to forward-pass your input samples through the network in order to extract features and find the samples that contribute to your loss function. This forward pass can be done with model.train(True) or model.train(False); in both cases you will get the same features as long as the network has no dropout layer. Here are my questions:

  1. If my network has dropout, do I have to forward-pass the input samples twice through the network?
    • once for feature extraction with model.train(False), to create the mined samples
    • once more to forward-pass these mined samples with model.train(True)
  2. If the answer to the previous question is yes:
    • since dropout is a kind of regularization, but it forces us to forward-pass twice (higher computation time), is it still a good idea to use this layer?
    • or do you think removing the dropout layer is fine, so there is no need to forward-pass twice?
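For reference, here is a minimal sketch of the two-pass pattern described in question 1, using a hypothetical `EmbedNet` with a dropout layer and hard negative mining by nearest distance (the network, shapes, and mining criterion are illustrative assumptions, not a prescribed implementation):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical embedding network containing a dropout layer
class EmbedNet(nn.Module):
    def __init__(self, in_dim=32, out_dim=8):
        super().__init__()
        self.fc1 = nn.Linear(in_dim, 16)
        self.drop = nn.Dropout(p=0.5)
        self.fc2 = nn.Linear(16, out_dim)

    def forward(self, x):
        return self.fc2(self.drop(F.relu(self.fc1(x))))

torch.manual_seed(0)
model = EmbedNet()
anchors = torch.randn(4, 32)
positives = torch.randn(4, 32)
candidates = torch.randn(20, 32)   # pool to mine negatives from

# Pass 1: model.train(False) (dropout off) + no_grad,
# used only to select the hard negatives deterministically
model.train(False)
with torch.no_grad():
    a = model(anchors)             # (4, 8) anchor embeddings
    c = model(candidates)          # (20, 8) candidate embeddings
    dists = torch.cdist(a, c)      # (4, 20) pairwise distances
    hard_idx = dists.argmin(dim=1) # closest candidate = hardest negative

# Pass 2: model.train(True) (dropout on), forward the mined
# triplets again with gradients enabled and compute the loss
model.train(True)
emb_a = model(anchors)
emb_p = model(positives)
emb_n = model(candidates[hard_idx])
loss = F.triplet_margin_loss(emb_a, emb_p, emb_n, margin=1.0)
loss.backward()
```

Note that the first pass costs extra compute but no memory for gradients (thanks to `torch.no_grad()`), which is part of the trade-off the second question is about.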