Using indices or a mask should also work, but you wouldn’t get a view, and I don’t know of any approach that would give you a view in this case:
A = torch.randn(4)
B = torch.randn(3)
C = torch.empty(len(A)+len(B))
mask = torch.tensor([True, False, False, True, True, True, False])
C[mask] = A
C[~mask] = B
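A quick check (a sketch) confirming the point above that masked assignment copies rather than views: C gets its own storage, so later changes to A don’t propagate into C.

```python
import torch

A = torch.randn(4)
B = torch.randn(3)
C = torch.empty(len(A) + len(B))
mask = torch.tensor([True, False, False, True, True, True, False])
C[mask] = A
C[~mask] = B

# C owns separate storage; the masked assignment copied A's values.
assert C.data_ptr() != A.data_ptr()
A[0] = 100.0
assert C[mask][0].item() != 100.0  # C kept the old value of A[0]
```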
This is interesting! Thank you for the suggestion!
So in both these approaches we fill a new tensor C using A and B values.
In my use case, I need C for a later matrix multiplication, so I’d like to use the additional dimension resulting from stack (you can think of it as a batch dimension).
A = torch.randn(3,2)
B = torch.randn(3,2)
conditions = torch.tensor([True, False, False, True]) # size (4)
C = torch.stack([A if c.item() else B for c in conditions], dim=0) # size (4,3,2)
D = torch.randn(4,2,5)
E = torch.matmul(C, D) # size (4,3,5)
Is there a way to create an ephemeral C that reads from the A and B storages just once? It would be great to avoid allocating the whole C tensor.
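For reference, a minimal sketch of one possible workaround (not a true view, and it trades memory for compute by evaluating both products): let torch.matmul broadcast A and B over the batch dimension, then select per batch element with torch.where.

```python
import torch

A = torch.randn(3, 2)
B = torch.randn(3, 2)
conditions = torch.tensor([True, False, False, True])
D = torch.randn(4, 2, 5)

# matmul broadcasts the 2-D operands over D's batch dim, so no
# (4, 3, 2) tensor is materialized -- at the cost of computing
# both products for every batch element.
AD = torch.matmul(A, D)  # size (4, 3, 5)
BD = torch.matmul(B, D)  # size (4, 3, 5)
E = torch.where(conditions.view(4, 1, 1), AD, BD)

# Matches the explicit stack-based construction of C:
C = torch.stack([A if c.item() else B for c in conditions], dim=0)
assert torch.allclose(E, torch.matmul(C, D))
```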