Reassigning model parameters whose sizes have changed due to pruning

Hi, I obtained my model’s parameters with

old_params = {}
for name, params in model.named_parameters():
old_params[name] = params.clone()

Then I played with the weights and biases and ended up with a downsized network. Since the layer sizes have changed, I can't reassign the old weights back through model.named_parameters():

for name, params in model.named_parameters():
    params.data.copy_(old_params[name])  # fails with a size mismatch

However, I need to do this because I am going to compare the runtimes of the forward functions. In other words, I want to compare apples to apples. Any idea how to handle this would be appreciated.

I assume you’ve sliced some parameters to downsize your model.
How did you create the logic to remove some parameters?
Would the same (slicing) operations work on the cloned parameters?
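For reference, a minimal sketch of what such slicing could look like on a cloned layer (the layer sizes and the "keep the first N units" rule here are made-up assumptions, not the poster's actual pruning logic):

```python
import torch
import torch.nn as nn

# Original layer: weight shape (400, 784)
fc = nn.Linear(784, 400)

# Hypothetical pruning: keep only the first 350 output units.
# The same slicing that produced the pruned weights can be applied
# to cloned copies of the original parameters.
kept = 350
pruned = nn.Linear(784, kept)
with torch.no_grad():
    pruned.weight.copy_(fc.weight[:kept])  # slice rows of the weight matrix
    pruned.bias.copy_(fc.bias[:kept])

x = torch.randn(1, 784)
out = pruned(x)
print(out.shape)  # torch.Size([1, 350])
```

Note that a slice along the output dimension of one layer also has to be applied along the input dimension of the next layer, so the shapes keep fitting together.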

PS: I'm not sure if that would be useful regarding the performance of the model.
If you just want to compare the runtimes, random values might also work just fine.

Thanks for the response. Assume I have fully connected layers with weight shapes (400, 784), (400, 400), (100, 400), and (10, 100).
After the downsizing operations on the cloned parameters, the new network has shapes (350, 784), (125, 350), (125, 15), (10, 15). At this point, I can't reassign this downsized network's parameters to the original network, whose shapes are different, as expected. I just want to use model(xb), as in the tutorials, for both the original network and the cropped network.

But following your idea, I can define a new network with the size of the downsized network, train it, and run the same test data through it. To measure the difference, that should work fine, right?
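A sketch of that comparison might look as follows. The helper and the layer widths of the downsized model are assumptions (the posted shapes are used where they chain together consistently):

```python
import time
import torch
import torch.nn as nn

def make_mlp(sizes):
    """Build a simple fully connected net from a list of layer widths."""
    layers = []
    for in_f, out_f in zip(sizes[:-1], sizes[1:]):
        layers += [nn.Linear(in_f, out_f), nn.ReLU()]
    return nn.Sequential(*layers[:-1])  # drop the trailing ReLU

original = make_mlp([784, 400, 400, 100, 10])
downsized = make_mlp([784, 350, 125, 15, 10])

xb = torch.randn(64, 784)

def time_forward(model, xb, n_iters=100):
    """Time repeated forward passes without autograd overhead."""
    model.eval()
    with torch.no_grad():
        start = time.perf_counter()
        for _ in range(n_iters):
            model(xb)
    return time.perf_counter() - start

t_orig = time_forward(original, xb)
t_small = time_forward(downsized, xb)
print(f"original: {t_orig:.4f}s, downsized: {t_small:.4f}s")
```

Since both models take the same input batch and produce the same output size, the measured times should be directly comparable.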

I think it should work.
It looks like you’ve posted the shapes of the weight matrices of both models.
However, if that's the case, the shapes of the weight matrices of the downsized model don't fit together. Is this a typo, or am I misunderstanding something?