Set model weights to preset tensor with torch

I am attempting to train a torch model with neuro-evolution, but I cannot seem to set the weights of a model to a preset tensor. I am able to:

Assign weights based on random values,

    for param in i.model.parameters():
        param.data = torch.rand(param.shape)

But I cannot seem to do something like setting all the models equal to the fittest model:

    for i in self.botList:
        for param in i.model.parameters():
            param.data = self.botList[self.fittest.index].model.parameters()

Where self.botList is my object array of genomes, and self.botList[self.fittest.index].model.parameters() loads the fittest model's parameters.

Any help would be appreciated, this is my first time using torch. Thanks, Preston


I just tried to save and load the model.state_dict()

    path = "./"
    torch.save(self.botList[self.fittest.index].model.state_dict(), path)

It creates a file, but always produces a RuntimeError: method 'detach' already has a docstring
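For reference, the usual round trip is torch.save on the state_dict and load_state_dict on the target model. A minimal sketch, using an assumed filename (torch.save needs a file path, not a bare directory like "./") and a stand-in nn.Linear in place of the thread's model class:

```python
import torch
import torch.nn as nn

# A stand-in for the bot's network; the real model class is not shown
# in the thread, so a single Linear layer is used here.
model = nn.Linear(4, 2)

# "fittest.pt" is an assumed filename for this sketch; torch.save
# expects a file path rather than a directory.
path = "./fittest.pt"
torch.save(model.state_dict(), path)

# Load the saved parameters into a fresh model of the same architecture.
clone = nn.Linear(4, 2)
clone.load_state_dict(torch.load(path))
```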

In the following line of code:

the left-hand side of the assignment refers to the weights of one layer, but the right-hand side refers to all model parameters. I think you would want to iterate over the best model and the current model together with the zip() function. Assuming that you have two models, current_model and best_model, the following should work:

    for param_cur, param_best in zip(current_model.parameters(), best_model.parameters()):
        param_cur.data = param_best.data
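A runnable version of that loop, with two stand-in nn.Linear models (any pair with identical architectures works). Using copy_() instead of a plain assignment writes the values into the existing tensor rather than aliasing best_model's tensor, and no_grad() keeps autograd out of the in-place update:

```python
import torch
import torch.nn as nn

# Hypothetical models standing in for the current genome and the
# fittest genome; any two models with the same architecture work.
current_model = nn.Linear(4, 2)
best_model = nn.Linear(4, 2)

# Walk both parameter lists in lockstep and copy the values over.
# copy_() fills the existing tensor in place, so later edits to one
# model do not leak into the other.
with torch.no_grad():
    for param_cur, param_best in zip(current_model.parameters(),
                                     best_model.parameters()):
        param_cur.copy_(param_best)
```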

Thank you for your prompt reply.
Your solution was just what I needed. Thank you so much!

Sure, you are welcome! I am glad it worked! :blush:

Hey, so there is a slight problem… I have an array of Symmetricmodel() objects. Whenever I run your code and then try to modify one object's parameters(), every one of the parameters() in my object array is changed. Do you know of a way to fix this?


Also… does parameters().data contain the bias?

Regarding Symmetricmodel, I do not know what that is. Can you provide some code so that I can see it?

Wasn’t that what you wanted? Or do you want to change only certain layers and keep the others fixed?
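One common cause of that symptom, assuming the array was built in the usual way, is that every slot in the list references the same module object, so "changing one" changes all of them. copy.deepcopy gives each genome its own parameter tensors; a sketch under that assumption, with nn.Linear standing in for Symmetricmodel:

```python
import copy
import torch
import torch.nn as nn

base = nn.Linear(4, 2)  # stand-in for Symmetricmodel()

# Buggy pattern: every slot is the SAME module object, so modifying
# one genome's parameters appears to modify all of them.
shared = [base] * 3

# Fixed pattern: each genome gets an independent copy of the model,
# with its own weight and bias tensors.
independent = [copy.deepcopy(base) for _ in range(3)]

with torch.no_grad():
    independent[0].weight.zero_()
# independent[1].weight keeps its original random values
```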

About your other question:

Yes. The parameters() will include both weights and biases.
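This is easy to check with named_parameters(), which labels each tensor; for a plain nn.Linear the registered parameters are the weight matrix and the bias vector:

```python
import torch.nn as nn

layer = nn.Linear(3, 2)
names = [name for name, _ in layer.named_parameters()]
# For nn.Linear this yields ['weight', 'bias'], so iterating over
# parameters() touches the bias vector as well as the weight matrix.
```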


I think I may have solved my problem by adding requires_grad=False. Assigning values to certain tensors caused others to be altered, which was really weird. At this point, some models with the exact same weights perform very differently, but that might just be my computer.

Here’s my code.
