Since I have a problem with vanishing gradients while training my model, I added a TensorBoard writer to monitor the model's gradients:
```python
for name, param in self.model.named_parameters():
    self.writer.add_histogram(f"{name}.grad", param.grad, iteration)
```
This led to `ValueError("The histogram is empty, please file a bug report.")`.
Even after changing that to
```python
for name, param in self.model.named_parameters():
    if torch.isnan(param.grad).any():
        self.logger.warning(
            f"Gradient {name}.grad is NaN in iteration {iteration}"
        )
    else:
        self.writer.add_histogram(f"{name}.grad", param.grad, iteration)
```
I still get the same error.
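For reference, here is a standalone sketch of the stricter check I am considering next, which also guards against gradients that are `None` (e.g. frozen parameters) or contain Inf, since `torch.isnan` alone would miss both cases. The two-layer model is just a stand-in for my actual network, and the snippet uses only `torch` so it runs without TensorBoard:

```python
import torch
import torch.nn as nn

# Stand-in model; my real model is larger, this only illustrates the check.
model = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 1))

# One dummy forward/backward pass to populate .grad.
model(torch.randn(16, 4)).sum().backward()

for name, param in model.named_parameters():
    grad = param.grad
    if grad is None:
        print(f"{name}: no gradient")           # e.g. frozen / unused parameters
    elif not torch.isfinite(grad).all():
        print(f"{name}: contains NaN or Inf")   # candidate for breaking add_histogram
    else:
        print(f"{name}: ok, mean={grad.mean():.4f}")
```

In my trainer the `print` calls would be replaced by `self.logger.warning` and `self.writer.add_histogram` as above.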
The GitHub issue "The histogram is empty" (pytorch/pytorch #31855) was closed as resolved, but multiple users still seem to run into the same error.
Is there something I am missing?
Thanks in advance