TensorBoard with multiprocessing

Hi all,

Do you know if I can run torch.utils.tensorboard along with multiprocessing? If so, can anybody provide some hints/code? I tried tensorboardX with multiprocessing, and the problem I had is that each spawned process was initialising its own SummaryWriter. I know the solution is for only the master process to initialise the SummaryWriter, which should then be used globally by all processes, but since I'm not experienced with multiprocessing I don't know how to make this work. In tensorboardX, a recent commit implements GlobalSummaryWriter (link), but I couldn't get that running either; I was getting errors even though I followed the GitHub documentation exactly. It's still experimental, so I'm guessing there are still bugs.

Thank you.

Hi @overfitted, did you find a solution to this problem? Thank you.

If you’re using torch.multiprocessing (which you probably should be doing), you’ll get the process index as the first parameter of your entry point function. You can consider index 0 to be your master process and do all of your summary writing in that process.

from torch.utils.tensorboard import SummaryWriter
import torch.multiprocessing as mp

summary_dir = 'runs/experiment'  # wherever your logs should go

def my_entry_point(index):
  if index == 0:  # only the master process writes summaries
    writer = SummaryWriter(summary_dir)
    writer.add_scalar('test', 1.0, 1)
    writer.close()

if __name__ == '__main__':
  mp.spawn(my_entry_point, args=(), nprocs=2)
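
In case the other processes also need to log something: one pattern (just my own sketch, not a built-in feature of torch.utils.tensorboard) is to funnel metrics to process 0 through a multiprocessing queue, so the writer still lives in a single process. The NPROCS constant and the tag names here are placeholders I made up:

import torch.multiprocessing as mp
from torch.utils.tensorboard import SummaryWriter

NPROCS = 2

def my_entry_point(index, queue):
  if index == 0:
    writer = SummaryWriter('runs/experiment')
    writer.add_scalar('loss/rank0', 1.0, 1)
    # drain one message per worker in this toy example
    for _ in range(NPROCS - 1):
      tag, value, step = queue.get()
      writer.add_scalar(tag, value, step)
    writer.close()
  else:
    # workers never touch the writer; they just send their metrics
    queue.put((f'loss/rank{index}', 0.5, 1))

if __name__ == '__main__':
  queue = mp.get_context('spawn').Queue()
  mp.spawn(my_entry_point, args=(queue,), nprocs=NPROCS)

This keeps a single event file and avoids the corruption you can get when several processes write to the same log directory.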

Hi,

Thanks for your answer! I will try and let you know. Cheers!