Hi,
I’m currently using torch.distributed and want to pass non-tensor objects (e.g. a dictionary or a NumPy array) from one process to another. I considered converting the objects to tensors first, but each serialized element ends up with a different size, which makes it hard to gather them onto one local process. Do you have any idea how to handle this?
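To show the kind of conversion I mean: the sketch below (single-process, hypothetical helper names, using pickle and padded byte buffers in place of an actual `torch.distributed.all_gather` of ByteTensors) is roughly what I was attempting. The padding to a common `max_len` is exactly the awkward part, since every rank has to agree on the buffer size up front.

```python
import pickle

def to_padded_bytes(obj, max_len):
    # Serialize an arbitrary picklable object (dict, NumPy array, ...)
    # and pad it so every rank would send a same-sized buffer.
    data = pickle.dumps(obj)
    assert len(data) <= max_len, "buffer too small for this object"
    # Prefix with the true payload length so the receiver can strip padding.
    header = len(data).to_bytes(8, "little")
    return header + data + b"\x00" * (max_len - len(data))

def from_padded_bytes(buf):
    # Read the length prefix, then unpickle only the real payload.
    n = int.from_bytes(buf[:8], "little")
    return pickle.loads(buf[8:8 + n])

# Single-process round trip standing in for the distributed gather:
objs = [{"loss": 0.1}, {"acc": 0.9, "step": 3}]
max_len = max(len(pickle.dumps(o)) for o in objs)
buffers = [to_padded_bytes(o, max_len) for o in objs]
assert [from_padded_bytes(b) for b in buffers] == objs
```

In the real distributed version, each buffer would be wrapped in a `torch.ByteTensor` and the ranks would first exchange their payload lengths to agree on `max_len` before the gather.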