# Distance between two sets of neural net weights

Hi all,

Is there a quick way to access (and then plot) the l2 norm of the distance between the initial set of weights w_0 and the set of weights at iteration t, w_t?
I’d like to access this quantity and then plot it in 2D (along a random dimension).

I believe `net.parameters()`, where `net` is defined as below,

```
import torch.nn as nn
import torch.nn.functional as F

class MnistNetSmall(nn.Module):
    def __init__(self):
        super(MnistNetSmall, self).__init__()
        self.conv1 = nn.Conv2d(1, 20, 5, 1)
        self.fc1 = nn.Linear(20 * 12 * 12, 10)
        #self.fc1 = nn.Linear(28 * 28, 10)

    def forward(self, x):
        x = F.relu(self.conv1(x))
        x = F.max_pool2d(x, 2, 2)
        x = x.view(-1, 20 * 12 * 12)
        #x = x.view(-1, 28 * 28)
        x = self.fc1(x)
        return F.log_softmax(x, dim=1)

net = MnistNetSmall()
```

is what I need. However, this quantity is a generator object.

Any ideas how to collect `net.parameters()` at each iteration for plotting purposes?

Thanks all
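
On the generator point: the parameters can be flattened into a single 1-D tensor with `torch.nn.utils.parameters_to_vector`, which makes a whole-network `||w_t - w_0||` straightforward. A minimal sketch, assuming the distance over all parameters (rather than a single layer) is what is wanted; the names `w0`, `wt`, and `l2_distance` are just placeholders:

```
import torch
from torch.nn.utils import parameters_to_vector

# Snapshot of the initial parameter vector w_0 (cloned so training doesn't overwrite it)
w0 = parameters_to_vector(net.parameters()).detach().clone()

# Later, at iteration t:
wt = parameters_to_vector(net.parameters()).detach()
l2_distance = torch.norm(wt - w0, p=2).item()  # ||w_t - w_0||_2
```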

You can use this approach:

If you want the conv1 layer's weights:

```
conv_weights = net.conv1.weight
```

or the fc1 layer's weights:

```
fc1_weights = net.fc1.weight
```

As these layers are defined in `__init__`, you can access them by name and get their weights.
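
If the end goal is the l2 distance to the initial weights, the same named access can be used; a minimal sketch for `fc1` (the names `w0_fc1` and `dist` are just placeholders):

```
import torch

# Keep a clone of the initial fc1 weights so later updates don't change the snapshot
w0_fc1 = net.fc1.weight.detach().clone()

# ... later, after an optimizer.step() at iteration t:
dist = torch.norm(net.fc1.weight.detach() - w0_fc1, p=2).item()  # ||w_t - w_0||_2
```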

Hope this helps.

Indeed, that was the syntax.
However, when I try to store the weights of, say, my first fully connected layer as in

```
all_weights = []

for epoch in range(start_epoch, start_epoch + epochs):
    train_loss = 0
    correct = 0
    total = 0
    ip_loss = []
    grad_loss = []
    for batch_idx, (inputs, targets) in enumerate(trainloader):
        inputs, targets = inputs.to(device), targets.to(device)
        optimizer.zero_grad()
        outputs = net(inputs)
        loss = criterion(outputs, targets)
        loss.backward()
        _, diag_args = optimizer.step()
        ip_loss.append(diag_args['ip_loss'])
        grad_loss.append(diag_args['grad_loss'])
        train_loss += loss.item()
        _, predicted = outputs.max(1)
        total += targets.size(0)
        correct += predicted.eq(targets).sum().item()
    logger_diag.append([lr, momentum, newstat, train_loss, 100. * correct / total, np.sum(ip_loss)])

    # Append weights of fc1 through the iterations
    all_weights.append(net.fc1.weight.data.detach().numpy())
```

Those weights appear to be constant through the iterations/epochs.
The weights should be updated at each `optimizer.step()`.
Any idea why?

Are you sure the model is training? Does the loss go down as training progresses?

Yes, training is happening.
Copying the array of weights before appending made it work.
I don’t know why, but I guess `copy()` is always a good friend.

```
a = net.fc1.weight.data.detach().numpy().copy()
all_weights.append(a)
```
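
The likely reason: `tensor.numpy()` returns an array that shares memory with the tensor, so without `copy()` every entry appended to `all_weights` is a view of the same storage and ends up showing the final weights. A small sketch of the difference (CPU tensors assumed; GPU tensors would need `.cpu()` first):

```
import torch

w = torch.zeros(2, 2)
view = w.detach().numpy()             # shares memory with w
snapshot = w.detach().numpy().copy()  # independent copy

with torch.no_grad():
    w += 1.0  # stands in for an optimizer update

print(view)      # reflects the update: all ones
print(snapshot)  # still all zeros
```

An all-torch alternative is `all_weights.append(net.fc1.weight.detach().clone())`, which avoids the NumPy round trip entirely.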