Custom weight initialization

Ah sorry about that, I forgot to mention torch.nn.Parameter, which makes the weight recognizable as a parameter when you later do something like

    optimizer = torch.optim.Adam(model.parameters(), lr=learning_rate)
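
For context: model.parameters() only yields attributes that are wrapped in torch.nn.Parameter, so a plain tensor assigned to a module is invisible to the optimizer. A quick sketch (the names are made up) to show the difference:

    import torch

    class Net(torch.nn.Module):
        def __init__(self):
            super().__init__()
            # registered: appears in self.parameters()
            self.w = torch.nn.Parameter(torch.randn(3, 3))
            # not registered: a plain tensor attribute is skipped by parameters()
            self.v = torch.randn(3, 3)

    model = Net()
    print([name for name, _ in model.named_parameters()])  # prints ['w']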

So, in your case, the following should fix it:

    with torch.no_grad():
        # replace the default weight with the custom kernel K
        self.conv1.weight = torch.nn.Parameter(K)
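
For completeness, a self-contained sketch of how that fits into a module (K here is a stand-in for your custom kernel; it has to match conv1.weight's shape, i.e. (out_channels, in_channels, kH, kW)):

    import torch

    class Net(torch.nn.Module):
        def __init__(self, K):
            super().__init__()
            self.conv1 = torch.nn.Conv2d(1, 8, kernel_size=3)
            assert K.shape == self.conv1.weight.shape
            # replace the default weight with the custom kernel
            self.conv1.weight = torch.nn.Parameter(K)

        def forward(self, x):
            return self.conv1(x)

    K = torch.randn(8, 1, 3, 3)  # stand-in for a precomputed kernel
    model = Net(K)
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    # conv1.weight now starts from K and is updated by the optimizer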

EDIT: I am actually not sure it even requires no_grad here, since this happens in __init__ and not in the forward call, and assigning a brand-new Parameter is not an in-place modification of an existing tensor.
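
By contrast, no_grad is definitely needed if you write into the existing weight in place instead of replacing it (a minimal sketch, assuming model and K from above):

    # in-place alternative: keeps the same Parameter object
    with torch.no_grad():
        model.conv1.weight.copy_(K)  # raises a RuntimeError outside no_grad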
