Integrating Lambda layer and Dropout during testing

How can I use a Lambda layer to force a percentage of a layer's weights to zero during both training and testing, once they have been chosen to be dropped?

I switched to PyTorch a couple of days ago, so a detailed explanation would be very helpful. Please find the Keras script I was using below, which uses Lambda and Dropout for this purpose.

from keras import backend as K
from keras.layers import Dropout, Lambda

class PermanentDropout(Dropout):
    def __init__(self, rate, **kwargs):
        super(PermanentDropout, self).__init__(rate, **kwargs)

    def build(self, input_shape):
        # Disable the learning-phase switch so dropout is applied at test time too
        self.uses_learning_phase = False
        super(PermanentDropout, self).build(input_shape)

    def call(self, x, mask=None):
        if 0. < self.rate < 1.:
            noise_shape = self._get_noise_shape(x)
            x = Lambda(lambda x: K.dropout(x, self.rate, noise_shape))(x)
        return x

As far as I understand your code, you are basically using a Dropout layer that stays active regardless of the learning phase, controlled by a special flag you defined.
If that’s the case, you can simply call the functional dropout (`F.dropout`) in your `forward` method and use its `training` argument to activate or deactivate it, independently of `model.eval()`.
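
A minimal sketch of this idea (the module and layer names here are illustrative, not from the original post): hard-coding `training=True` in `F.dropout` keeps dropout active even after calling `model.eval()`.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class PermanentDropoutNet(nn.Module):
    """Toy network whose dropout stays active at both train and test time."""
    def __init__(self, p=0.5):
        super().__init__()
        self.p = p
        self.fc = nn.Linear(10, 10)

    def forward(self, x):
        x = self.fc(x)
        # training=True forces dropout to be applied regardless of
        # self.training, i.e. it ignores model.eval()
        x = F.dropout(x, p=self.p, training=True)
        return x

model = PermanentDropoutNet(p=0.5)
model.eval()  # dropout is still applied because training=True is hard-coded
out = model(torch.randn(4, 10))
```

If you later want dropout to respect the usual train/eval switch again, pass `training=self.training` instead of a hard-coded `True`.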
