Change dropout rate in jit script

Hi,
is it possible to change the dropout rate of a pretrained model (JIT script)?
I have a working pretrained model for image classification where the classification layer is trainable. Dropout is implemented, but I would like to change the dropout rate when fewer training samples are used, to avoid overfitting.
Thanks
Chris

import torch

# Walk all submodules and patch the dropout probability in place
for name, module in model.named_modules():
    if isinstance(module, torch.nn.Dropout):
        module.p = new_dropout_rate

Would something like this work?
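
For what it's worth, here is a minimal eager-mode sketch of that idea; the toy model and the rate of 0.2 are just placeholders. nn.Dropout stores p as a plain Python attribute that is read at forward time, so assigning to it takes effect immediately:

import torch
import torch.nn as nn

# Toy stand-in for a pretrained classifier (hypothetical)
model = nn.Sequential(
    nn.Linear(128, 64),
    nn.ReLU(),
    nn.Dropout(p=0.5),
    nn.Linear(64, 10),
)

new_dropout_rate = 0.2

# Patch the dropout probability on every Dropout submodule
for name, module in model.named_modules():
    if isinstance(module, nn.Dropout):
        module.p = new_dropout_rate

print(model)  # the Dropout layer now reports p=0.2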

Thanks for your answer,
actually, this doesn't seem to work here. After reading through the JIT-generated files, I found that the dropout rate is a hardcoded argument in the scripted code; it's not a parameter (like the weights).

Thanks for the update
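 One possible workaround, assuming you still have (or can reconstruct) the original eager-mode model definition: load the trained weights into the eager model, change the rate there, and re-script. The MyModel class and the file names below are only placeholders:

import torch
import torch.nn as nn

class MyModel(nn.Module):
    # Placeholder for the real architecture (hypothetical)
    def __init__(self):
        super().__init__()
        self.backbone = nn.Linear(128, 64)
        self.drop = nn.Dropout(p=0.5)
        self.head = nn.Linear(64, 10)

    def forward(self, x):
        return self.head(self.drop(torch.relu(self.backbone(x))))

eager_model = MyModel()

# Copy the trained weights over from the scripted model
scripted = torch.jit.load("model_scripted.pt")  # hypothetical file name
eager_model.load_state_dict(scripted.state_dict())

# Change the dropout rate while still in eager mode
for module in eager_model.modules():
    if isinstance(module, nn.Dropout):
        module.p = 0.7

# Re-script and save the patched model
torch.jit.script(eager_model).save("model_rescripted.pt")

Note this only helps if dropout is an nn.Dropout submodule. If the rate was passed as a literal to F.dropout inside forward (which would explain it showing up as a hardcoded argument in the generated files), it has to be changed in the source before scripting.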