About dropout randomness, and how to disable Dropout during the training step?

First, I found that the training log changed when I added a dropout layer, even though I had fixed every other source of randomness by setting the random seed. How can I make the dropout randomness reproducible as well?
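For reference, this is roughly how I fix the other sources of randomness before training (a minimal sketch; SEED is just a placeholder value):

import random

import numpy as np
import torch

SEED = 0  # placeholder seed value

random.seed(SEED)                 # Python's built-in RNG
np.random.seed(SEED)              # NumPy RNG
torch.manual_seed(SEED)           # torch CPU (and default CUDA) RNG
torch.cuda.manual_seed_all(SEED)  # all CUDA devices, if any are used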
Second, when I want to enable dropout after calling net.eval(), I can use net.apply(apply_dropout). The apply_dropout function is:

def apply_dropout(m):
    if type(m) == nn.Dropout:
        m.train()

However, when I wanted to disable the dropout layer during training with the inverse operation, it did not work. Specifically, I called net.apply(de_apply_dropout) after net.train(). The de_apply_dropout function is:

def de_apply_dropout(m):
    if type(m) == nn.Dropout:
        m.eval()

The purpose of these operations is to enable the dropout layer in some epochs and disable it in others while the network is trained. How can I solve this?
Thanks in advance.

Your code seems to work correctly and I can use it to enable or disable the dropout layer:

import torch
import torch.nn as nn

def apply_dropout(m):
    # switch dropout layers (back) to train mode so they drop activations
    if type(m) == nn.Dropout:
        m.train()

def de_apply_dropout(m):
    # switch dropout layers to eval mode so they pass activations through
    if type(m) == nn.Dropout:
        m.eval()

class MyModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.lin = nn.Linear(10, 10)
        self.drop = nn.Dropout()
        
    def forward(self, x):
        x = self.lin(x)
        x = self.drop(x)
        return x

model = MyModel()
x = torch.randn(100, 10)
out = model(x)
print((out==0.0).sum())
> tensor(480)

model.eval()
out = model(x)
print((out==0.0).sum())
> tensor(0)

model.train()
out = model(x)
print((out==0.0).sum())
> tensor(501)

model.apply(de_apply_dropout)
out = model(x)
print((out==0.0).sum())
> tensor(0)

model.apply(apply_dropout)
out = model(x)
print((out==0.0).sum())
> tensor(475)
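Since x has 100 * 10 = 1000 elements and nn.Dropout defaults to p=0.5, roughly half of the outputs are zeroed whenever dropout is active, which matches the counts above.

One caveat for your per-epoch use case: model.train() and model.eval() recursively reset the mode of every submodule, including the dropout layer, so de_apply_dropout has to be re-applied after each model.train() call. A minimal sketch of such a loop (epochs_without_dropout and the loop bounds are just placeholders):

epochs_without_dropout = {2, 3}  # placeholder: epochs that should run without dropout

for epoch in range(5):
    model.train()  # puts every submodule, including dropout, back into train mode
    if epoch in epochs_without_dropout:
        model.apply(de_apply_dropout)  # switch dropout off again for this epoch
    out = model(x)
    # ... compute the loss, call loss.backward(), step the optimizer ...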

Thank you.
I got it.