How to compute/plot magnitude of gradients of each layer

Hello,

I would like to know if there is a straightforward, memory-efficient way to compute the magnitude of the gradients of each layer at every epoch and plot them with TensorBoard.

Hi @DeepLearner17,

You can compute the norm efficiently via a view like so,

norms = {}
for name, param in model.named_parameters():
  if param.grad is not None:  # params not touched by the loss have grad=None
    # flatten the gradient and take its Euclidean (L2) norm
    norms[name] = param.grad.view(-1).abs().pow(2).sum(-1).sqrt()
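If you'd rather lean on a built-in (assuming a reasonably recent PyTorch where torch.linalg.vector_norm is available), this should be equivalent, since vector_norm flattens the tensor and takes the L2 norm by default:

import torch

# equivalent one-liner using PyTorch's built-in vector norm
norms[name] = torch.linalg.vector_norm(param.grad)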

Then you can pass these to TensorBoard as usual.
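For the plotting side, here is a minimal sketch using torch.utils.tensorboard; writer, epoch, and the grad_norm/ tag prefix are just placeholder choices:

from torch.utils.tensorboard import SummaryWriter

writer = SummaryWriter()  # writes event files under ./runs by default
# log one scalar curve per layer, with the epoch as the step
for name, norm in norms.items():
  writer.add_scalar(f"grad_norm/{name}", norm.item(), epoch)

Because of the slash in the tag, each layer shows up as its own curve grouped under a grad_norm section in TensorBoard.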

EDIT: Added the .grad attribute to compute the Euclidean norm of the gradient.

Thank you @AlphaBetaGamma96.
To be clear, it's not the weights of the layers I'm after but the gradients associated with each weight:

norms[name] = param.grad.abs().view(-1).pow(2).sum(-1).sqrt()
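For reference, a rough sketch of where this could run once per epoch; model, loader, criterion, optimizer, num_epochs, and writer are placeholder names and the training loop is simplified:

for epoch in range(num_epochs):
  for inputs, targets in loader:
    optimizer.zero_grad()
    loss = criterion(model(inputs), targets)
    loss.backward()
    optimizer.step()
  # .grad is still populated after the epoch's last backward()/step()
  for name, param in model.named_parameters():
    if param.grad is not None:
      norm = param.grad.view(-1).abs().pow(2).sum(-1).sqrt()
      writer.add_scalar(f"grad_norm/{name}", norm.item(), epoch)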