Reset model weights


(Vadim "Paddy") #1

I would like to know if there is a way to reset the weights of a PyTorch model.

Here is my code:

import torch.nn as nn

class Net(nn.Module):
    def __init__(self):
        super(Net, self).__init__()
        self.conv1 = nn.Conv2d(1, 16, kernel_size=5)
        self.conv2 = nn.Conv2d(16, 32, kernel_size=5)
        self.conv3 = nn.Conv2d(32, 64, kernel_size=5)
        self.conv4 = nn.Conv2d(64, 64, kernel_size=5)

        self.pool = nn.MaxPool2d(5, stride=3)
        self.pool2 = nn.MaxPool2d(3, stride=1)
        self.activation = nn.ReLU()
        self.fc1 = nn.Linear(4096, 20)
        self.fc2 = nn.Linear(20, 1)

    def forward(self, x):
        x = self.activation( self.pool(self.conv1(x)) )
        x = self.activation( self.pool(self.conv2(x)) )
        x = self.activation( self.pool(self.conv3(x)) )
        x = self.activation( self.pool2(self.conv4(x)) )
        x = x.view(-1, 64*8*8)

        x = self.activation(self.fc1(x))
        x = self.fc2(x)
        return x

I just want to average a couple of runs of the model in order to evaluate it.

Any idea how I can do that?


#2

You could save the state_dict once and load it again later to reset the model. Have a look at the Serialization Semantics notes to see how to do it.
Would this work for you, or do you want to re-initialize the model with random weights?
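
For example, a minimal sketch could look like this (the file name init_weights.pth is just a placeholder):

# save the initial weights once
torch.save(model.state_dict(), 'init_weights.pth')

# ... train / evaluate ...

# restore them to reset the model
model.load_state_dict(torch.load('init_weights.pth'))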


(Vadim "Paddy") #3

Thank you for the hint.

However, is there also a way to re-initialize the weights randomly?


#4

Sure! You just have to define your init function:

def weights_init(m):
    # re-initialize all conv layers with Xavier/Glorot uniform init
    if isinstance(m, nn.Conv2d):
        torch.nn.init.xavier_uniform_(m.weight)

And call it on the model with:

model.apply(weights_init)
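
Note that this function only matches nn.Conv2d, so your fully connected layers would keep their old weights. If you also want to reset them, you could extend the check, e.g. something like:

def weights_init(m):
    # reset both conv and fully connected layers
    if isinstance(m, (nn.Conv2d, nn.Linear)):
        torch.nn.init.xavier_uniform_(m.weight)
        if m.bias is not None:
            torch.nn.init.zeros_(m.bias)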

If you want to have the same random weights for each initialization, you would need to set the seed before calling this method with:

torch.manual_seed(your_seed)
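
To tie it back to your original question, a rough sketch for averaging a couple of runs could look like this (num_runs is just an example value, and train_one_run and evaluate are placeholders for whatever training and evaluation code you already have):

num_runs = 5                          # example value
results = []
for run in range(num_runs):
    torch.manual_seed(run)            # optional: different but reproducible weights per run
    model.apply(weights_init)         # reset the weights before each run
    train_one_run(model)              # placeholder for your training loop
    results.append(evaluate(model))   # placeholder for your evaluation code

mean_score = sum(results) / len(results)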