Re-initialise weights and biases to PyTorch defaults in k-fold cross-validation

I’ve defined a function to initialise weights and biases so that I can run it at every fold of a k-fold cross-validation. However, I find that PyTorch’s default weight and bias initialisation performs better than my attempt to improve on it. So how can I restore the default PyTorch weights and biases each time I move on to the next fold, without rebuilding the whole model?

You can call the .reset_parameters() method of the corresponding module to reset its trainable parameters to the default initialization.
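As a minimal sketch of that suggestion: most built-in layers (Linear, Conv2d, etc.) implement reset_parameters(), so you can walk over the submodules and call it wherever it exists. The helper name reset_all_parameters below is my own, not part of PyTorch:

```python
import torch
import torch.nn as nn

def reset_all_parameters(model: nn.Module) -> None:
    """Restore PyTorch's default initialization on every submodule
    that supports it (Linear, Conv2d, BatchNorm, ...)."""
    for module in model.modules():
        # Not every module defines reset_parameters (e.g. ReLU, Flatten),
        # so guard with hasattr before calling it.
        if hasattr(module, "reset_parameters"):
            module.reset_parameters()
```

Note that this draws a fresh random sample from the default initialization scheme each time it is called, rather than restoring the exact values the model started with.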

I don’t understand this part of the question as I would assume you want to (and should) reset the model in each new run.

Thank you for responding. The weights and biases should be reset for each fold; otherwise they carry over training from the previous fold, which means data leakage, i.e. training on the test data. So I wrote a function to reset only the weights and biases, rather than rebuilding the whole model, which might take longer. For the reset itself I followed general advice I found online that seemed appropriate for the model, which is a couple of CNN layers followed by a couple of fully connected layers. However, while evaluating how good my weight and bias settings were, I realised that PyTorch’s default initialization was better than my own attempt. So what I want is to restore the default PyTorch weights and biases at the start of every fold, without rebuilding the model each time, which would take longer. I’m still a bit of a novice in this area, but I hope this makes sense. Many thanks.
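If you want each fold to start from the exact same default initialization (rather than a fresh random draw), an alternative sketch is to snapshot the freshly constructed model’s state_dict once and restore it at the start of every fold. The architecture and the number of folds below are placeholders, not your actual model:

```python
import copy
import torch
import torch.nn as nn

# Placeholder model: a couple of conv layers and a fully connected head,
# roughly matching the architecture described above (sizes assumed).
model = nn.Sequential(
    nn.Conv2d(1, 8, kernel_size=3), nn.ReLU(),
    nn.Flatten(),
    nn.Linear(8 * 26 * 26, 10),   # 28x28 input -> 26x26 after the conv
)

# Snapshot the default PyTorch initialization once, right after construction.
initial_state = copy.deepcopy(model.state_dict())

for fold in range(5):
    # Restore the default weights and biases so nothing leaks between folds.
    model.load_state_dict(initial_state)
    # ... train and evaluate on this fold ...
```

This avoids rebuilding the model each fold while guaranteeing every fold trains from an identical starting point, which also makes fold-to-fold comparisons a little cleaner.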