I was wondering how I would re-initialize the weights of my model without having to re-instantiate the model?
This is not a robust solution and won't work for anything except core torch.nn layers, but this works:
for layer in model.children():
    if hasattr(layer, 'reset_parameters'):
        layer.reset_parameters()
I ended up saving the initial model parameters into a temporary file and then reloading it at the start of each CV fold.
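A minimal sketch of that save-and-reload approach, assuming a toy `nn.Sequential` model and five CV folds (both are placeholders, not from the original post):

```python
import tempfile

import torch
import torch.nn as nn

# Hypothetical model standing in for whatever is being cross-validated.
model = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 1))

# Save the freshly initialized parameters to a temporary file once.
with tempfile.NamedTemporaryFile(suffix=".pt", delete=False) as f:
    init_path = f.name
torch.save(model.state_dict(), init_path)

for fold in range(5):
    # Restore the initial weights at the start of each fold.
    model.load_state_dict(torch.load(init_path))
    # ... train and evaluate on this fold ...
```

Saving the `state_dict` (rather than the whole model object) keeps the file small and avoids pickling the model class.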
Two questions:
- For this to work with any layer, the user has to implement `reset_parameters` in their own custom layers, right?
- Why didn't you use `.modules()`?
Based on this, we probably need `.modules()` to properly recurse into all submodules of the model:
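A sketch of the recursive version: `.children()` only yields a module's direct children, while `.modules()` also walks nested submodules. The model below is an assumption for illustration, with a `Linear` buried one level deep:

```python
import torch
import torch.nn as nn

def reset_all_weights(model: nn.Module) -> None:
    # .modules() yields the model itself and every nested submodule,
    # so resets reach layers wrapped inside containers too.
    for m in model.modules():
        if hasattr(m, "reset_parameters") and callable(m.reset_parameters):
            m.reset_parameters()

# Hypothetical model with a nested Sequential that .children() alone would miss.
model = nn.Sequential(nn.Linear(4, 8), nn.Sequential(nn.Linear(8, 2)))
reset_all_weights(model)
```

An equivalent idiom is `model.apply(fn)`, which applies `fn` recursively to every submodule.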
The C++ way is below:
auto net = std::make_shared<Net>();
for (auto& child : net->children()) {
    if (child->name() == "torch::nn::LinearImpl") {
        child->as<torch::nn::LinearImpl>()->reset_parameters();
    }
}
where `Net` is
struct Net : torch::nn::Module {
......
};