Manually set dropout for each of the samples

Let’s say I have a sparse matrix of ones and zeros (N×M) that I would like to use as a dropout filter on a conv2d layer output, which is also N×M. Is it possible to achieve that by simply multiplying it with the output?

Furthermore, is it possible to set a different dropout “mask” for each of the X batch samples? Or would I need to run X backprops with a batch size of 1?

Thank you.

You should be able to multiply the conv output with your custom mask.
If you want to use a different mask for each batch element, give your mask a batch dimension so that each sample in the batch gets its own mask.
Are you seeing any issues with this approach?
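A minimal sketch of this approach (the shapes here are illustrative assumptions, not taken from the thread):

```python
import torch

# Illustrative shapes: 4 samples, 8 conv output channels, 16x16 feature maps.
batch, channels, h, w = 4, 8, 16, 16
conv_out = torch.randn(batch, channels, h, w)

# One binary mask per batch element. The singleton channel dimension lets
# the mask broadcast across all feature maps of that sample.
mask = (torch.rand(batch, 1, h, w) > 0.5).float()

# Element-wise multiplication acts as the dropout filter; gradients flow
# through it during backprop, so a single backward pass handles all samples.
dropped = conv_out * mask
print(dropped.shape)  # torch.Size([4, 8, 16, 16])
```

Because the mask has a batch dimension, one forward/backward pass covers all samples; there is no need for X separate backprops.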

Thank you, I have implemented it this way.

Is there any convenient way of knowing whether forward is called during training or evaluation? I would like to apply the dropout mask between the Sequential blocks in the forward function.

I guess passing a flag as a parameter to forward() is not recommended?

Thank you.

You could check the internal self.training flag inside the module; it is set to True by model.train() and to False by model.eval().
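A minimal sketch of how this could look (the module name, layer sizes, and the optional mask argument are assumptions for illustration):

```python
import torch
import torch.nn as nn

class MaskedNet(nn.Module):
    """Hypothetical module applying a custom dropout mask only in training mode."""

    def __init__(self):
        super().__init__()
        self.conv = nn.Conv2d(1, 4, kernel_size=3, padding=1)

    def forward(self, x, mask=None):
        out = self.conv(x)
        # self.training is True after model.train(), False after model.eval(),
        # so no extra flag needs to be passed into forward().
        if self.training and mask is not None:
            out = out * mask  # apply the custom per-sample dropout mask
        return out

model = MaskedNet()
x = torch.randn(2, 1, 8, 8)
mask = (torch.rand(2, 1, 8, 8) > 0.5).float()

model.train()
train_out = model(x, mask)  # mask is applied
model.eval()
eval_out = model(x, mask)   # mask is ignored
```

This mirrors how the built-in nn.Dropout behaves: active in train mode, a no-op in eval mode.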