Do the foreach operators require the same shape?

foreach operators receive a TensorList as input. Must the tensors stored in the TensorList have the same shape for the foreach kernels?

No, as seen in this small example:

import torch

device = "cuda"
l = [torch.randn(20, 20, device=device), torch.randn(10, 10, device=device)]
out = torch._foreach_abs(l)
out = torch._foreach_add(l, 100.)
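For a runnable sketch that doesn't require a GPU, the same calls work on CPU tensors; the foreach op over tensors of different shapes should match applying the op to each tensor in a plain Python loop:

```python
import torch

# Tensors of different shapes in one list.
l = [torch.randn(20, 20), torch.randn(10, 10)]
out = torch._foreach_abs(l)

# Equivalent per-tensor loop for comparison.
expected = [t.abs() for t in l]
print(all(torch.equal(a, b) for a, b in zip(out, expected)))  # True
```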

Note that this API is in beta and could thus change.

Thank you, ptrblck. I have another question: when we add a new backend, we found that the PyTorch frontend performs broadcasting. For example, with x1_shape=[2, 3, 4, 5] and x2_shape=[4, 5], in torch._foreach_add([x1], [x2]) x2 will be broadcast to [2, 3, 4, 5] automatically, so the backend doesn't need to do it again. Is this correct? And do you know where in the code this broadcasting happens?

Could you post an executable code example?

Sorry for the late reply because of the forums migration. I have found the answer: the frontend expands x2 to [1, 1, 4, 5] but doesn't change the data stored in memory.
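A minimal sketch of that expansion behavior (assuming a CPU build of PyTorch): `expand()` produces a broadcast view with stride 0 along the new dimensions, so no data is copied; the foreach op then produces the fully broadcast output shape.

```python
import torch

x1 = torch.randn(2, 3, 4, 5)
x2 = torch.randn(4, 5)

# expand() creates a view; the underlying storage is unchanged.
x2_view = x2.expand(2, 3, 4, 5)
print(x2_view.stride())                      # (0, 0, 5, 1) -- zero strides, no copy
print(x2_view.data_ptr() == x2.data_ptr())   # True

# The foreach op broadcasts x2 against x1 automatically.
out = torch._foreach_add([x1], [x2])
print(out[0].shape)                          # torch.Size([2, 3, 4, 5])
```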
Thanks for your support!