Variable dropout rate throughout training

How do I set a high dropout rate at the beginning of training, to make the weight matrix more sparse, and then keep reducing this dropout rate every certain number of epochs?
For example, for the first 50 epochs the dropout rate could be 0.7, for the next 50 epochs 0.6, then 0.5, and for the last 50 epochs 0.2.

And during evaluation, use the last dropout rate.

If you are using dropout as a module, you could manipulate the .p attribute after your specified number of epochs:

import torch
import torch.nn as nn

drop = nn.Dropout(p=0.5)

x = torch.randn(10)
out = drop(x)
print((out == 0.).sum())
> tensor(5)

# lower the dropout rate in place
drop.p = 0.2
out = drop(x)
print((out == 0.).sum())
> tensor(1)
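(The exact counts are random, since dropout samples a fresh Bernoulli mask on every call; the outputs above are just one example run.)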

In your case, the manipulation would of course look like this:

model.drop1.p = 0.2
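For instance, a minimal training-loop sketch for the schedule from the question could look like this (the Net class, its drop1 attribute, and the 200-epoch loop are just assumptions for illustration):

import torch.nn as nn

# hypothetical example model; 'drop1' matches the attribute name used above
class Net(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(10, 10)
        self.drop1 = nn.Dropout(p=0.7)  # start with the highest rate

    def forward(self, x):
        return self.drop1(self.fc(x))

model = Net()

# epoch at which to switch -> new dropout rate
schedule = {50: 0.6, 100: 0.5, 150: 0.2}

for epoch in range(200):
    if epoch in schedule:
        model.drop1.p = schedule[epoch]
    # ... the usual forward/backward/optimizer steps go here ...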

Alternatively, you could use the functional API from the beginning and provide the drop probability in the forward method of your model:

import torch.nn.functional as F

def forward(self, x, p):
    # self.training is switched by model.train() / model.eval()
    x = F.dropout(x, p, training=self.training)
    return x
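A minimal self-contained sketch of that approach (FunctionalNet and the tensor shapes are my assumptions, not from the original code):

import torch
import torch.nn as nn
import torch.nn.functional as F

class FunctionalNet(nn.Module):  # hypothetical name for illustration
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(10, 10)

    def forward(self, x, p):
        x = self.fc(x)
        # training=self.training disables dropout once model.eval() is called
        x = F.dropout(x, p, training=self.training)
        return x

model = FunctionalNet()
x = torch.randn(1, 10)

out = model(x, p=0.7)  # early epochs: high rate
out = model(x, p=0.2)  # late epochs: low rate

model.eval()
out = model(x, p=0.2)  # no elements are dropped in eval mode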