Random initialization within a range on a pretrained model

Hello,

I would like to set the weights and biases of my PyTorch model (which is already trained) randomly within a range. For instance, say the value of a weight is 12. I'd like it to be set to a random value from a custom range (6-18, for example), and the same for every weight/bias. It's like a naive random search, although I'm just trying to experiment with the model.

One example could be this (in pseudocode):

for weight, bias in model:  # for the entire model
    weight = random(weight / 2, weight * 2)
    bias = random(bias / 2, bias * 2)

How can I set those randomly within that custom range and save the new model (as a .pt for instance)?

Thanks,

You could do something like

layer.weight.data.fill_(random_weight)
layer.bias.data.fill_(random_bias)

then to save it you would just do

torch.save(model.state_dict(), "new_model.pt")

The only problem with this is you have to set each layer individually.
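
If you don't want to name every layer by hand, one way around that is to loop over the modules. Rough, untested sketch; it assumes the model is built from nn.Linear / nn.Conv2d layers (adjust the isinstance check for your architecture) and uses the 6-18 range from your example:

import random

import torch
import torch.nn as nn

# Fill each layer's weight/bias tensors with a random scalar from the chosen range.
for module in model.modules():
    if isinstance(module, (nn.Linear, nn.Conv2d)):
        module.weight.data.fill_(random.uniform(6.0, 18.0))
        if module.bias is not None:
            module.bias.data.fill_(random.uniform(6.0, 18.0))

torch.save(model.state_dict(), "new_model.pt")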

Thanks for your answer. But wouldn't layer.weight.data.fill_(random_weight) set all the values in that layer to the same value? What I want is to set each weight/bias to a random value within a range around its current value (weight-2 to weight+2, for example), not all weights/biases in a layer to the exact same number. It should vary depending on the current values the model already has. Something like this is what I have in mind (rough, untested sketch; model is my trained network):
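
import torch

# Rough, untested sketch: perturb every element independently, using its
# current value to define the range (current value - 2 to current value + 2).
with torch.no_grad():
    for param in model.parameters():
        low = param - 2
        high = param + 2
        # torch.rand_like draws uniform [0, 1) noise with the same shape as param,
        # so each element gets its own random value within [low, high).
        param.copy_(low + (high - low) * torch.rand_like(param))

torch.save(model.state_dict(), "new_model.pt")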