Enable dropout only after reaching a specific score on train/validation

Hi, I have a problem with my model due to a small amount of data.
If I use a dropout layer, the model sometimes fails to learn and gets stuck in a local minimum of the loss.
If I don't use it, the model learns successfully every time, but it overfits.
Basically, I want to enable the dropout layer only after I've made sure the model has escaped the local minimum that causes these problems.
I thought about the following:

def forward(self, x, score=None):
    x = self.seq1(x)  # each seq has layers, activation functions and BatchNorm
    # apply dropout only when no score is given, or once the score passes the threshold
    if score is None or score > self.threshold:  # threshold stored on the module in __init__
        x = self.dropout(x)
    x = self.seq2(x)
    return x
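
For context, here is a minimal self-contained sketch of how I plan to wire this up. Everything here is a placeholder for illustration: `Net`, the layer sizes, the `dropout_threshold` value, and the dummy data are not my real setup, and `val_score` would be, e.g., the validation accuracy from the previous epoch:

import torch
import torch.nn as nn

class Net(nn.Module):  # placeholder name for my model
    def __init__(self, dropout_threshold=0.9):
        super().__init__()
        # stand-ins for my real seq1/seq2 (layers, activations, BatchNorm)
        self.seq1 = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.BatchNorm1d(32))
        self.seq2 = nn.Linear(32, 2)
        self.dropout = nn.Dropout(p=0.5)
        self.threshold = dropout_threshold  # score above which dropout is enabled

    def forward(self, x, score=None):  # same forward as above
        x = self.seq1(x)
        if score is None or score > self.threshold:
            x = self.dropout(x)
        x = self.seq2(x)
        return x

model = Net()
optimizer = torch.optim.Adam(model.parameters())
criterion = nn.CrossEntropyLoss()
val_score = 0.0  # updated after each validation pass

for epoch in range(2):
    model.train()
    inputs = torch.randn(8, 16)            # dummy batch
    targets = torch.randint(0, 2, (8,))    # dummy labels
    optimizer.zero_grad()
    outputs = model(inputs, score=val_score)  # dropout stays off until val_score > threshold
    loss = criterion(outputs, targets)
    loss.backward()
    optimizer.step()
    # ... run validation here and update val_score ...

The idea is that dropout stays off while val_score is below the threshold, and switches on for the rest of training once the score crosses it.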

Do you think this is a valid way to do it? Will it cause any problems with torch autograd or optimizer.step()?
If you can think of a better way to do this, please let me know.
Thank you in advance,
Kfir