Tensorboard JSON dump of all scalars

In essence I simply want to dump all scalars into a JSON file so that I can quickly import them into matplotlib and create more flexible plots.

Doing that in hindsight, after completing the experiment, is cumbersome, so I was hoping I could do it at the end of the running experiment and circumvent all the laborious tracking of paths to files etc.

I am aware that tensorboardX offers an export_scalars_to_json function, but that ties you to tensorboardX and requires the scalars to be written through its writer.

Does anybody have any experience with the EventAccumulator code in tensorboard who could help me out?
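For anyone else wondering about it: the EventAccumulator API itself is fairly small. A minimal sketch (the path below is just a placeholder for your run directory or event file; Reload() has to be called before any tags are available):

```python
from tensorboard.backend.event_processing import event_accumulator

# point this at your run directory or at a single event file (placeholder: '.')
ea = event_accumulator.EventAccumulator('.')
ea.Reload()  # events are only read from disk when Reload() is called

# each scalar tag yields a list of events carrying .step, .value and .wall_time
for tag in ea.Tags()['scalars']:
    for event in ea.Scalars(tag):
        print(tag, event.step, event.value)
```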

I wrote a function that exports the data directly from the SummaryWriter's log directory.

In case somebody stumbles over the same problem:

import os
import json

import torch.utils.tensorboard
from tensorboard.backend.event_processing import event_accumulator


def export_jsondump(writer):

	assert isinstance(writer, torch.utils.tensorboard.SummaryWriter)

	tf_files = []  # -> list of paths from writer.log_dir to all files in that directory
	for root, dirs, files in os.walk(writer.log_dir):
		for file in files:
			tf_files.append(os.path.join(root, file))  # go over every file recursively in the directory

	for file_id, file in enumerate(tf_files):

		path = os.path.dirname(file)  # determine path to folder in which file lies
		name = os.path.basename(path) if file_id > 0 else 'data'  # separate the files created by add_scalars from the one created by add_scalar

		# print(file, '->', path, '|', name)

		event_acc = event_accumulator.EventAccumulator(file)
		event_acc.Reload()  # events are only available after loading them from disk
		data = {}

		hparam_file = False  # I save hparam files as 'hparam/xyz_metric'
		for tag in sorted(event_acc.Tags()["scalars"]):
			if tag.split('/')[0] == 'hparam':
				hparam_file = True  # check if it is a hparam file
			step, value = [], []

			for scalar_event in event_acc.Scalars(tag):
				step.append(scalar_event.step)
				value.append(scalar_event.value)

			data[tag] = (step, value)

		if not hparam_file and bool(data):  # if it is not a hparam file and there is data -> dump it
			with open(os.path.join(path, f'{name}.json'), "w") as f:
				json.dump(data, f)
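Reading the dump back for plotting is then straightforward. A small sketch (the file name and the tag are made up for illustration; the matplotlib call is only indicated in a comment):

```python
import json

# write an example file in the same format the function above produces: tag -> (steps, values)
with open('data.json', 'w') as f:
    json.dump({'train/loss': ([0, 1, 2], [1.5, 1.1, 0.9])}, f)

with open('data.json') as f:
    data = json.load(f)  # note: tuples come back as lists after the JSON round trip

for tag, (steps, values) in data.items():
    print(tag, steps, values)
    # plt.plot(steps, values, label=tag)  # e.g. matplotlib for the flexible plots
```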