import torch.nn as nn

dropout = nn.Dropout(0.1)
if dropout is not None:
    dropout.p = 0.2
Can you do this? Can you change the dropout rate on the fly, after you have declared your model class?
It seems like this should work, since the Dropout layer passes self.p as the rate argument to nn.functional.dropout on every forward call:
https://pytorch.org/docs/stable/_modules/torch/nn/modules/dropout.html#Dropout
class Dropout(_DropoutNd):
    def forward(self, input):
        return F.dropout(input, self.p, self.training, self.inplace)
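To check the idea end to end, here is a minimal sketch (the TinyNet module name is made up for illustration): because forward reads self.p at call time, mutating the attribute on an already-constructed model changes the effective dropout rate, which you can observe by setting p=1.0 and seeing the output zeroed in training mode.

```python
import torch
import torch.nn as nn

class TinyNet(nn.Module):
    """Hypothetical example model with a dropout layer."""
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(4, 4)
        self.dropout = nn.Dropout(0.1)

    def forward(self, x):
        return self.dropout(self.fc(x))

model = TinyNet()

# Change the rate on the fly, after construction.
model.dropout.p = 0.2
print(model.dropout.p)  # 0.2

# Sanity check: with p=1.0 in training mode, every activation is dropped.
model.dropout.p = 1.0
model.train()
out = model(torch.randn(2, 4))
print(out.abs().sum().item())  # 0.0 -- all outputs zeroed
```

Note that this mutates module state, not a registered parameter, so the change is immediate but will not be captured by optimizers or gradients; it simply alters what the next forward pass does.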