How can I use a Lambda layer to force a percentage of a layer's units to zero during both training and testing, once they have been chosen to be dropped?
I switched to PyTorch a couple of days ago, so a detailed explanation would be very helpful. Below is the Keras script I was using, which combines Lambda and Dropout for this purpose:
from keras.layers import Dropout, Lambda
import keras.backend as K

class PermanentDropout(Dropout):
    def __init__(self, rate, **kwargs):
        super(PermanentDropout, self).__init__(rate, **kwargs)

    def build(self, input_shape):
        # Disable the learning-phase switch so dropout also runs at test time
        self.uses_learning_phase = False
        super(PermanentDropout, self).build(input_shape)

    def call(self, x, mask=None):
        if 0. < self.rate < 1.:
            noise_shape = self._get_noise_shape(x)
            x = Lambda(lambda x: K.dropout(x, self.rate, noise_shape))(x)
        return x
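In PyTorch, one way to get the same "always-on" dropout behavior is a small custom module that calls `torch.nn.functional.dropout` with `training=True` unconditionally, so the mask is applied even after `model.eval()`. This is a minimal sketch, not an official PyTorch recipe; the class name `PermanentDropout` is my own:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class PermanentDropout(nn.Module):
    """Dropout that stays active in both train and eval modes."""
    def __init__(self, p=0.5):
        super().__init__()
        self.p = p

    def forward(self, x):
        # training=True forces F.dropout to apply the mask regardless
        # of whether the parent model is in train or eval mode.
        return F.dropout(x, p=self.p, training=True)

# Usage sketch: even after model.eval(), activations are still dropped.
model = nn.Sequential(nn.Linear(10, 20), PermanentDropout(0.5), nn.Linear(20, 1))
model.eval()
x = torch.randn(4, 10)
y = model(x)
```

Note that `F.dropout` samples a fresh random mask on every forward pass. If you instead want one fixed set of units "chosen to be dropped" and then kept zeroed for the rest of training and testing, you would sample a mask tensor once, register it with `self.register_buffer(...)`, and multiply by it in `forward`.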