torch.cuda.comm.reduce_add_coalesced() can handle a list of tensors with different sizes, but gather() cannot.
Is there a way to make gather support different tensor sizes?
Thanks
I don’t know if it would solve your problem, but you can use gather() with tensors of different sizes by repeating them along added dimensions:
values = torch.rand(3, 7)
indices = (7 * torch.rand(5, 6)).long()  # indices into values' 2nd dimension (size 7)
repeated_values = values.view(3, 7, 1, 1).expand(3, 7, 5, 6)
repeated_indices = indices.view(1, 1, 5, 6).expand(3, 1, 5, 6)  # note the size 1 on the gather dimension
gathered = repeated_values.gather(1, repeated_indices).squeeze(1)  # tensor of size (3, 5, 6)
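As a sanity check (not part of the original reply), the expand/gather trick above is equivalent to plain advanced indexing, which makes it easy to verify:

```python
import torch

values = torch.rand(3, 7)
indices = (7 * torch.rand(5, 6)).long()

# expand/gather trick: gathered[i, k, l] == values[i, indices[k, l]]
repeated_values = values.view(3, 7, 1, 1).expand(3, 7, 5, 6)
repeated_indices = indices.view(1, 1, 5, 6).expand(3, 1, 5, 6)
gathered = repeated_values.gather(1, repeated_indices).squeeze(1)

# reference result via advanced indexing
expected = values[:, indices]  # shape (3, 5, 6)
assert torch.equal(gathered, expected)
```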
Sorry for my unclear description. I think you’re focusing on the torch.gather method, but what I’m using is torch.cuda.comm.gather, which communicates data between multiple GPUs and gathers it onto a specific device.
Thanks
http://pytorch.org/docs/master/cuda.html?highlight=gather#torch.cuda.comm.gather
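One possible workaround sketch (an assumption on my part, not an official API): since torch.cuda.comm.gather requires all tensors to have the same size in every dimension except the gather dimension, you could pad each tensor to a common size first, gather, and keep the original lengths to slice the result afterwards. The helper pad_to_common_width below is hypothetical and only handles 2-D tensors differing along dim 1:

```python
import torch
import torch.nn.functional as F

def pad_to_common_width(tensors):
    """Right-pad 2-D tensors along dim 1 so they all share the same width.

    Returns the padded tensors plus the original widths, so the
    gathered result can be sliced back to the true lengths later.
    """
    max_w = max(t.size(1) for t in tensors)
    lengths = [t.size(1) for t in tensors]
    padded = [F.pad(t, (0, max_w - t.size(1))) for t in tensors]
    return padded, lengths

a = torch.rand(4, 3)
b = torch.rand(4, 5)
padded, lengths = pad_to_common_width([a, b])
# padded tensors now all have shape (4, 5); on a multi-GPU setup they
# could be moved to their devices and passed to torch.cuda.comm.gather
if torch.cuda.is_available():
    gathered = torch.cuda.comm.gather([p.cuda() for p in padded], dim=0)
```

The padding step runs on any device; only the final comm.gather call needs CUDA.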