Logging gradients on each iteration

Hi,

I’d like to log gradients obtained during training to a file to analyze/replicate the training later.
What’s a convenient way of doing this in PyTorch?

grads = {n: p.grad.detach().cpu().clone() for n, p in model.named_parameters() if p.grad is not None}

gives you the grads of the model's parameters (skipping any that don't have a gradient yet). You can now store them away, either directly on disk (torch.save or, if you feel fancy, HDF5), or keep a list of them in memory (moving them to CPU and cloning is probably a good idea, so I threw that in above).
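For example, here is a minimal sketch of how that fits into a training loop; the toy model, loss, and file name below are just placeholders:

import torch
import torch.nn as nn

# Toy model and data, only for illustration.
model = nn.Linear(10, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
inputs = torch.randn(16, 10)
targets = torch.randn(16, 2)

grad_history = []  # one dict of gradients per iteration

for step in range(100):
    optimizer.zero_grad()
    loss = nn.functional.mse_loss(model(inputs), targets)
    loss.backward()
    # Snapshot the gradients after backward() but before the optimizer step.
    grads = {n: p.grad.detach().cpu().clone()
             for n, p in model.named_parameters()
             if p.grad is not None}
    grad_history.append(grads)
    optimizer.step()

# The whole history can be reloaded later with torch.load("grad_history.pt").
torch.save(grad_history, "grad_history.pt")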

Best regards

Thomas

If you use tensorboardX you could also log the gradients to TensorBoard to visualise them:

from tensorboardX import SummaryWriter

logger = SummaryWriter(LOG_DIR)

def log_gradients_in_model(model, logger, step):
    for tag, value in model.named_parameters():
        if value.grad is not None:
            logger.add_histogram(tag + "/grad", value.grad.cpu(), step)
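and then call it once per iteration after backward(), for instance in a (hypothetical) training loop; loader, criterion, and optimizer here are just placeholders:

for step, (inputs, targets) in enumerate(loader):
    optimizer.zero_grad()
    loss = criterion(model(inputs), targets)
    loss.backward()
    log_gradients_in_model(model, logger, step)  # log before optimizer.step()
    optimizer.step()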