Initialize model with sign of another model

How do I initialize one model with the (parameter) signs of another model?
I also want to modify parameters one unit at a time for each layer. Maybe both can be done the same way.

You can just modify the parameters. In torch 0.4 or master, you’d do:

import torch
l = torch.nn.Linear(2, 3)
l2 = torch.nn.Linear(2, 3)
with torch.no_grad():
    # copy the elementwise sign of l's weights into l2
    l2.weight.copy_(torch.sign(l.weight))
print(l.weight, l2.weight)

In torch 0.3, you operate on weight.data instead of weight directly, but otherwise it works the same way.

You can also use your favourite way of iterating over parameters (e.g. model.parameters(), model.named_parameters(), model.state_dict()) instead of addressing each one explicitly.
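For example, a sketch of the same sign-copy between two whole models using named_parameters(), assuming both models share the same architecture (src and dst are illustrative names, not from the thread):

```python
import torch

src = torch.nn.Linear(2, 3)
dst = torch.nn.Linear(2, 3)

with torch.no_grad():
    # Match destination parameters to source parameters by name,
    # then overwrite each one with the elementwise sign of the source.
    dst_params = dict(dst.named_parameters())
    for name, param in src.named_parameters():
        dst_params[name].copy_(torch.sign(param))
```

The same loop also gives you a natural place to change individual entries of each parameter tensor, since you hold a reference to every named parameter.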

Best regards

Thomas

Thanks for your reply. I just want to modify the question to add my new code:

    from collections import OrderedDict

    # initialize Binary Network with sign(W)
    state_dict = model_raw.state_dict()
    state_dict_quant = OrderedDict()
    for k, v in state_dict.items():
        state_dict_quant[k] = torch.sign(v)
    model_B.load_state_dict(state_dict_quant)

For the load_state_dict function, I think the state dicts of model_B and model_raw need to have the same keys, so maybe it is not a good solution.
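If the two models do have different state_dict keys (e.g. different module names), one possible workaround, assuming the parameters come out in the same order with the same shapes, is to pair them by position with zip instead of matching keys (the models below are stand-ins, not from the thread):

```python
import torch

model_raw = torch.nn.Sequential(torch.nn.Linear(2, 3))  # keys like "0.weight"
model_B = torch.nn.Linear(2, 3)                          # keys like "weight"

with torch.no_grad():
    # Pair parameters by position rather than by key name,
    # then copy the elementwise sign across.
    for p_raw, p_b in zip(model_raw.parameters(), model_B.parameters()):
        p_b.copy_(torch.sign(p_raw))
```

This sidesteps the key-matching that load_state_dict performs, at the cost of relying on the two models enumerating their parameters in the same order.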