I am new to ML and PyTorch, so this is probably a basic question.
I have seen some posts about making a custom mask for dropout, but in my case the mask depends on the input: drop values in a certain range, drop the max value, drop the min value, and so on, just to mention a few examples.
Since the mask-making process is generally not differentiable, I do not want the mask generation to be part of backpropagation. So in the forward pass, I generate the mask from the input and apply it; in the backward pass, I only want gradients to flow through the mask application, not the mask generation. Can I do this in PyTorch? If so, how?
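To make the question concrete, here is a rough sketch of what I have in mind (using "drop the max value" as the example rule; the `torch.no_grad()` block is my guess at how to keep the mask generation out of the autograd graph):

```python
import torch

def input_dependent_dropout(x):
    # Build the mask without tracking gradients, so the
    # (non-differentiable) mask-making is treated as a constant.
    with torch.no_grad():
        mask = (x != x.max()).float()  # example: drop the max value
    # Only this multiplication participates in autograd.
    return x * mask

x = torch.tensor([1.0, 3.0, 2.0], requires_grad=True)
y = input_dependent_dropout(x)
y.sum().backward()
print(x.grad)  # gradient is zero at the dropped position
```

Is this the right approach, or is there a more idiomatic way (e.g. `detach()` or a custom `autograd.Function`)?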