I keep running into the same problem when instantiating my model for the second time while performing k-fold cross-validation.
The model is initialised with the following hyperparameters:
input_size = 150
hidden_size = 256
embedding_dimensions = 512
num_layers = 2
cell_type = 'LSTM'
embedding_dropout = 0.0
dropout = 0.0
learning_rate = 0.0001
weight_decay = 0.00
model = Model(input_size, hidden_size, embedding_dimensions, num_layers, cell_type,
embedding_dropout, dropout, learning_rate, weight_decay)
Although I set self.embedding_dropout to 0.0 and the model builds without any problem on the first go, the following error is thrown when the model is built for the next fold:
TypeError Traceback (most recent call last)
<ipython-input-21-03c9ed583ab2> in <module>
4
5 model_trainer = Train(model, epochs, batch, augmentation_factor_train, augmentation_factor_val)
----> 6 model_trainer.cross_validate(train, val, 10)
<ipython-input-17-81438c5495cd> in cross_validate(self, training_set, validation_set, k_folds, split)
178 print('Fold {}/{} ...'.format(k+1, self.kfolds))
179
--> 180 self.run(k_train, k_val)
181
<ipython-input-17-81438c5495cd> in run(self, training_set, validation_set)
216
217 """
--> 218 self.model.build()
219
220 if self.pretraining_model is not None:
/projects/cc/kdqm927/PythonNotebooks/model/model.py in build(self)
73 self.embedding = nn.Embedding(self.input_size, self.embedding_dimensions).to(self.device)
74 self.input_rnn = self.embedding_dimensions
---> 75 self.embedding_dropout = nn.Dropout(p = self.embedding_dropout)
76 else:
77 self.input_rnn = self.input_size
~/.conda/envs/dalkeCourse/lib/python3.6/site-packages/torch/nn/modules/dropout.py in __init__(self, p, inplace)
8 def __init__(self, p=0.5, inplace=False):
9 super(_DropoutNd, self).__init__()
---> 10 if p < 0 or p > 1:
11 raise ValueError("dropout probability has to be between 0 and 1, "
12 "but got {}".format(p))
TypeError: '<' not supported between instances of 'Dropout' and 'int'
I was wondering if there is any way of getting around this problem.
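From the traceback, I suspect what happens is that build() overwrites the float attribute with the nn.Dropout module itself, so the second call passes a Dropout instance as p. Here is a minimal, torch-free sketch of that pattern (the Dropout class below is a hypothetical stand-in that only mimics the validation in torch.nn.Dropout's __init__):

```python
class Dropout:
    """Hypothetical stand-in for torch.nn.Dropout's __init__ validation."""
    def __init__(self, p=0.5):
        # Comparing a Dropout instance against an int raises TypeError here
        if p < 0 or p > 1:
            raise ValueError(
                "dropout probability has to be between 0 and 1, "
                "but got {}".format(p))
        self.p = p


class Model:
    def __init__(self, embedding_dropout=0.0):
        self.embedding_dropout = embedding_dropout  # a float at this point

    def build(self):
        # The float hyperparameter is overwritten with the module itself,
        # so a second build() passes a Dropout instance as p.
        self.embedding_dropout = Dropout(p=self.embedding_dropout)


model = Model(embedding_dropout=0.0)
model.build()      # first call: p is 0.0, builds fine
try:
    model.build()  # second call: p is now a Dropout instance
except TypeError as exc:
    print(exc)     # '<' not supported between instances of 'Dropout' and 'int'
```

If that is the cause, storing the module under a different name (e.g. something like self.embedding_dropout_layer) would avoid the collision, but I'd be interested in a cleaner approach.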