Hi, I want to implement cross-validation efficiently. Therefore, I would like to build one model (a neural network) once and then randomly reset its variables for each fold of the cross-validation.
Is there a way to reset the variables' values so that it is equivalent to creating a new model?
Some example code:

    class Toy(torch.nn.Module):
        def __init__(self):
            super().__init__()
            self.fc = torch.nn.Linear(3, 4)

    toy = Toy()
    cross_validation_acc = []
    for fold_num in range(5):
        ...  # reset toy's parameters here, then train and evaluate this fold
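For context, here is one way the per-fold data split itself could be generated; this is a sketch using plain NumPy (the function name `kfold_indices` is mine, not from any library):

```python
import numpy as np

def kfold_indices(n_samples, n_splits=5, seed=0):
    # Shuffle the indices once, then slice them into (almost) equal folds.
    rng = np.random.default_rng(seed)
    idx = rng.permutation(n_samples)
    folds = np.array_split(idx, n_splits)
    for k in range(n_splits):
        val_idx = folds[k]
        train_idx = np.concatenate([folds[j] for j in range(n_splits) if j != k])
        yield train_idx, val_idx
```

Each fold's validation indices are disjoint, and together they cover the whole dataset once.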
Thank you very much!
You could simply write an initialization function, which could look like this:

    def reset_parameters(self):
        for m in self.modules():
            if isinstance(m, nn.Conv2d):
                nn.init.kaiming_normal_(m.weight, mode='fan_out', nonlinearity='relu')
            elif isinstance(m, nn.BatchNorm2d):
                nn.init.constant_(m.weight, 1)
                nn.init.constant_(m.bias, 0)
and then call
toy.reset_parameters() at the beginning of each fold.
Note: the example has been taken from the torchvision GitHub repo and does not contain an initialization for
torch.nn.Linear. You would have to adapt it for your use case.
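Instead of re-implementing each module's default initialization by hand, one alternative is to call each submodule's own built-in `reset_parameters()` method, which re-applies its default initialization. This is a sketch (the helper name `reset_all_parameters` is mine):

```python
import torch
import torch.nn as nn

def reset_all_parameters(model):
    # Re-run each submodule's own built-in reset_parameters(), which
    # re-applies its default initialization (covers nn.Linear, nn.Conv2d,
    # nn.BatchNorm2d, and most other parametric layers).
    for m in model.modules():
        if m is not model and hasattr(m, 'reset_parameters'):
            m.reset_parameters()
```

This restores the *default* initialization, not a custom one like the Kaiming scheme above, so it only matches "creating a new model" if the model uses default initialization.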
Great. I will also look up the default initialization methods for the other modules. Thank you very much for your help!
Great! But if you use any optimizer other than plain SGD without momentum, you would also have to reinitialize your optimizer's
state_dict for comparable results, since such optimizers keep internal running statistics per parameter.
So the idea is to first record the
Optimizer.state_dict() and then use
Optimizer.load_state_dict() to reset it?
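For reference, a minimal sketch of that record-and-restore approach (using Adam purely for illustration; `copy.deepcopy` is used so the snapshot is not mutated by later training steps):

```python
import copy
import torch

model = torch.nn.Linear(3, 1)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-2)

# Snapshot the optimizer's (still empty) state before any training step.
initial_state = copy.deepcopy(optimizer.state_dict())

# One dummy training step populates Adam's running averages.
loss = model(torch.randn(8, 3)).pow(2).mean()
loss.backward()
optimizer.step()

# Restore the snapshot to reset the optimizer for the next fold.
optimizer.load_state_dict(initial_state)
```

Alternatively, simply constructing a fresh optimizer at the start of each fold achieves the same effect.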