How to add graphs to hparams in tensorboard?

Having dug a little deeper into torch.utils.tensorboard, I can't find any built-in way to do this. However, I can achieve it by modifying SummaryWriter.add_hparams() as follows (monkey-patched onto the class here so the snippet runs as-is):

    import torch
    from torch.utils.tensorboard import SummaryWriter
    from torch.utils.tensorboard.summary import hparams

    def add_hparams(self, hparam_dict, metric_dict, hparam_domain_discrete=None, run_name=None):
        torch._C._log_api_usage_once("tensorboard.logging.add_hparams")
        if type(hparam_dict) is not dict or type(metric_dict) is not dict:
            raise TypeError('hparam_dict and metric_dict should be dictionaries.')
        exp, ssi, sei = hparams(hparam_dict, metric_dict, hparam_domain_discrete)

        # Write the hparams summaries to this writer's own log_dir instead of
        # creating a new FileWriter in a run_name subdirectory, as the stock
        # implementation does. run_name is kept only for API compatibility.
        self.file_writer.add_summary(exp)
        self.file_writer.add_summary(ssi)
        self.file_writer.add_summary(sei)
        for k, v in metric_dict.items():
            if v is not None:  # None: the metric is already logged via add_scalar
                self.add_scalar(k, v)

    # Replace the stock method with the modified one.
    SummaryWriter.add_hparams = add_hparams

To use this, set log_dir to log_dir + '/[run_name]' when you create the SummaryWriter, so that the scalars and the hparams summaries for a run end up in the same per-run directory. Some example code showing how to use this is below.

    log_dir = 'runs'  # root directory; each run gets its own subdirectory

    s = SummaryWriter(log_dir=f'{log_dir}/run1')
    h_params = {'This': 1, 'is': 2, 'a': 3, 'test': 4}
    # None values tell the patched add_hparams not to re-log these metrics,
    # since they are logged step by step below.
    metrics = {'accuracy/accuracy': None, 'loss/loss': None}

    accuracy_values = [(1, 0.6), (2, 0.8), (3, 0.9), (4, 0.95)]
    loss_values = [(1, 3), (2, 1), (3, 0.5), (4, 0.11)]
    for step, v in accuracy_values:
        s.add_scalar('accuracy/accuracy', v, step)
    for step, v in loss_values:
        s.add_scalar('loss/loss', v, step)

    s.add_hparams(h_params, metrics)
    s.close()
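As a sketch of how this extends to comparing several runs in the HPARAMS tab (this is just an illustration: train() and the learning rates here are hypothetical, and it assumes the patched add_hparams above):

    # Minimal sketch: one subdirectory per run, so TensorBoard's HPARAMS tab
    # can compare them. train() is a hypothetical function returning a loss.
    for lr in (0.1, 0.01):
        w = SummaryWriter(log_dir=f'{log_dir}/lr_{lr}')
        final_loss = train(lr)
        w.add_scalar('loss/loss', final_loss, 0)
        w.add_hparams({'lr': lr}, {'loss/loss': None})
        w.close()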

Something I’m not sure is a bug or not: in the original code, scalars are logged to log_dir, whilst the hyperparameter summaries are written to their own subdirectory named run_name (a timestamp by default), so TensorBoard treats them as a separate run.
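For reference, the stock implementation does roughly the following after building the summaries (paraphrased from torch/utils/tensorboard/writer.py; details may differ between versions):

    if not run_name:
        run_name = str(time.time())
    logdir = os.path.join(self._get_file_writer().get_logdir(), run_name)
    with SummaryWriter(log_dir=logdir) as w_hp:
        w_hp.file_writer.add_summary(exp)
        w_hp.file_writer.add_summary(ssi)
        w_hp.file_writer.add_summary(sei)
        for k, v in metric_dict.items():
            w_hp.add_scalar(k, v)

That nested writer is why the metrics passed to add_hparams show up under a separate timestamped run rather than alongside the scalars.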

(I’ve posted this in the hope that it’s useful to someone.)
