How to calculate gradient for each layer?

Hi,

    for epoch in range(80):
        for i, (images, labels) in enumerate(train_loader):
            images = Variable(images.cuda())
            labels = Variable(labels.cuda())

            # Forward + Backward + Optimize
            optimizer.zero_grad()
            outputs = resnet(images)
            loss = criterion(outputs, labels)
            loss.backward()
            optimizer.step()

Above is my code. How can I record each layer's gradient?

Do you want intermediate gradients or weight gradients?
By "record", do you want to print them or save them?
There are a few threads already answering these questions.

@Chen-Wei_Xie
Search the forums; there are many threads that answer this question.

I am also looking for an answer to the same question, and I did not find an answer on the forum. Could someone post a minimal working example of how this is done?

optimizer.zero_grad()
y = net(x)
loss = criterion(y, target)
loss.backward()
grad_of_params = {}
for name, parameter in net.named_parameters():
    # clone so later zero_grad()/backward() calls don't overwrite the saved values
    grad_of_params[name] = parameter.grad.clone()
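The snippet above records weight gradients. If you also want intermediate (activation) gradients, one option is a tensor hook registered during the forward pass. Here is a minimal, self-contained sketch with a toy two-layer net; all names (`net`, `save_grad`, `linear0_out`) are placeholders for illustration, not from the thread:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
net = nn.Sequential(nn.Linear(4, 3), nn.ReLU(), nn.Linear(3, 2))
x = torch.randn(5, 4)
target = torch.randint(0, 2, (5,))
criterion = nn.CrossEntropyLoss()

grad_of_activations = {}

def save_grad(name):
    # register_hook calls this with the gradient w.r.t. the hooked tensor
    def hook(grad):
        grad_of_activations[name] = grad.detach().clone()
    return hook

h = net[0](x)                          # intermediate activation
h.register_hook(save_grad('linear0_out'))
y = net[2](net[1](h))
loss = criterion(y, target)
loss.backward()

# weight gradients live on the parameters, as in the snippet above
grad_of_params = {n: p.grad.clone() for n, p in net.named_parameters()}
```

After `loss.backward()`, `grad_of_activations['linear0_out']` holds the gradient of the loss with respect to the first layer's output, and `grad_of_params` holds the weight gradients. You can then print or save either dict as needed.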