How to get all the weights (not all the parameters)

Hi, I would like to know the max(abs(weight)) of every epoch, as shown in the figure, so that I can determine the bit width.
I think I can get max(abs(weight)) in the epoch loop, for example:

for epoch in range(1, epochs + 1):
    train_loss, t_accuracy = train(model, train_loader, optimizer)
    test_loss, accuracy = test(model, test_loader)
    epoch_nums.append(epoch)
    training_loss.append(train_loss)
    validation_loss.append(test_loss)
    test_accuracy.append(accuracy)
    train_accuracy.append(t_accuracy)
    output_1 = model.fc1.weight.data  # weights of one layer only
    if epoch % 1 == 0:
        print('Epoch {:d}: Training loss= {:.4f}, Validation loss= {:.4f}, train_accuracy={:.4%}, Accuracy={:.4%}'.format(
            epoch, train_loss, test_loss, t_accuracy, accuracy))
        print(max(abs(output_1)))

But this only gives me one layer’s max(abs(weight)). Can you give me advice on how to get all the weights of the model, not all the parameters?

Thank you very much for your comment and your time!!!

Hi,

You can iterate through the model’s state_dict and take the max of every weight matrix. Something like this:

for k, v in model.state_dict().items():
    if "weight" in k:
        print(max(abs(v)))

Thank you for your reply. :heart:
This is indeed a good idea… but when I try this on the weights, I get the error: “bool value of Tensor with more than one value is ambiguous”…
I can get max(abs(bias))… maybe I know what the problem is, but I don’t know how to solve it :joy:

Thank you again…
Best regards

Hi,

Sorry for the late reply. You need to specify the dimension along which you want the max value of the weight tensor, because different layers have a different number of dimensions; for example, linear layers have a 2-dimensional weight of shape (out_features, in_features).
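For example, a minimal sketch (using the state_dict loop from above, with a small stand-in model just for illustration) could look like this:

import torch
import torch.nn as nn

# a tiny stand-in model, just for illustration
model = nn.Sequential(nn.Linear(4, 3), nn.Linear(3, 2))

for k, v in model.state_dict().items():
    if "weight" in k:
        # .max(dim=...) returns a (values, indices) namedtuple
        per_column_max = v.abs().max(dim=0).values  # one value per input feature
        layer_max = v.abs().max()                   # a single scalar for the whole layer
        print(k, per_column_max, layer_max.item())

Calling .abs() and .max() as tensor methods also avoids the “bool value of Tensor” error that Python’s built-in max() raises on multi-dimensional tensors.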


Oh no, no… there is no need to say sorry, your help is my pleasure.
I think I understand your meaning, but I don’t know how to specify the dimension…
I am sorry, I am a beginner :joy:
But I will try…

Thank you!

No worries, consider the following example.

x = torch.randint(10, (3,3))
tensor([[8, 4, 5],
        [3, 0, 3],
        [9, 4, 6]])

Consider x to be your weight matrix; the shape of x is torch.Size([3, 3]).
When you call x.max(dim=0), dim is the dimension along which you are asking for the max value (note that x.max(dim=...) returns both the max values and their indices; only the values are shown below). dim=0 means we are asking for the max of each column, so it will return

tensor([9, 4, 6])

And when you call x.max(dim=1), dim=1 means we are asking for the max of each row, so it will return

tensor([8, 3, 9])

That is the difference I was trying to convey in my earlier reply.

I get it!!!
Thank you so much!!

You can flatten the model and then take the max of the NumPy array. Just pass your net to the function below; it will give you a flattened NumPy array of the model’s parameters and all their shapes. Then you can apply any NumPy operation to it.

import numpy as np

#############################################################################
# Flattening the NET
#############################################################################
def flattenNetwork(net):
    flatNet = []
    shapes = []
    for param in net.parameters():
        curr_shape = param.cpu().data.numpy().shape
        shapes.append(curr_shape)
        if len(curr_shape) == 2:
            # linear-layer weights: (out_features, in_features)
            param = param.cpu().data.numpy().reshape(curr_shape[0] * curr_shape[1])
            flatNet.append(param)
        elif len(curr_shape) == 4:
            # conv-layer weights: (out_channels, in_channels, kH, kW)
            param = param.cpu().data.numpy().reshape(
                curr_shape[0] * curr_shape[1] * curr_shape[2] * curr_shape[3])
            flatNet.append(param)
        else:
            # biases and other 1-D parameters
            param = param.cpu().data.numpy().reshape(curr_shape[0])
            flatNet.append(param)
    finalNet = []
    for obj in flatNet:
        for x in obj:
            finalNet.append(x)
    finalNet = np.array(finalNet)
    return finalNet, shapes
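For example, to get the largest absolute value over the whole model at each epoch, a usage sketch (assuming model is your network; note that the flattened array includes biases as well as weights) could be:

flat_params, shapes = flattenNetwork(model)
print(np.abs(flat_params).max())  # largest absolute parameter value in the whole model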

I have tried it. That’s great! Thank you very much!

Dear Sir, I am using your function flattenNetwork() to get the network parameters/weights and their shapes. I manipulate the weights, but I don’t change the shape of the network. How can I get back the original network with the manipulated weights? Please help. Thank you.

Not sure what you mean exactly. Do you mean you want to place the updated weights back into the network?
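If so, a minimal sketch of the reverse operation (assuming the flat array came from flattenNetwork above and only the values were changed, not the shapes) is to walk the flat array with the recorded shapes and copy each slice back into the corresponding parameter:

import numpy as np
import torch

def unflattenNetwork(net, flat_params, shapes):
    # flat_params: 1-D NumPy array produced by flattenNetwork
    # shapes: the list of parameter shapes returned alongside it
    offset = 0
    with torch.no_grad():
        for param, shape in zip(net.parameters(), shapes):
            numel = int(np.prod(shape))
            chunk = flat_params[offset:offset + numel].reshape(shape)
            param.copy_(torch.from_numpy(chunk).to(param.dtype))
            offset += numel
    return net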