How to pass non-tensor type elements from one process to another with torch distributed


I’m currently using torch.distributed and want to pass non-tensor elements (e.g. dictionaries, numpy arrays, etc.) from one process to another. I have considered converting the non-tensor objects to tensors first, but the problem is that each element has a different size, which makes it hard to gather them onto one local process. Do you have any idea?

I got an idea from the official Mask R-CNN repository. They pickle any picklable object into a ByteTensor, pad the buffers to a common length, and all_gather them from each rank.