I don’t think I got your question properly. But is this what you are trying?
import torch

device = "cuda" if torch.cuda.is_available() else "cpu"

a = torch.empty(10, 2).to(device)   # shape (N, M)
b = torch.empty(5, 2).to(device)    # shape (batch_size, M)
batch_size = 5
a[:batch_size, :] = b               # overwrite the first batch_size rows of a
Here, device can be either “cpu” or “cuda”. The numbers 10, 5, 2 are arbitrary values for demonstration (standing in for N, batch_size, and M; the order stays the same).
Close, but no, I’m trying to implement a kind of queue: first in, first out. Your assignment a[:batch_size, :] = b overwrites the top of a, but I want to shift a down so the top rows are preserved, and insert b before them, if that makes sense. When I try the in-place shift I get:
RuntimeError: unsupported operation: some elements of the input tensor and the written-to tensor refer to a single memory location. Please clone() the tensor before performing the operation.
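That error comes from reading and writing overlapping rows of the same tensor in one in-place op. A minimal sketch of the FIFO push that sidesteps it by cloning the source slice first (the helper name `push` is my own, just for illustration):

```python
import torch

def push(a, b):
    # FIFO insert: shift the existing rows of `a` down by len(b),
    # dropping the oldest rows off the bottom, then write `b` on top.
    n = b.shape[0]
    # clone() gives the shifted slice its own memory, so the source and
    # destination no longer overlap and the RuntimeError goes away
    a[n:] = a[:-n].clone()
    a[:n] = b
    return a

a = torch.arange(20.0).reshape(10, 2)   # rows [0,1], [2,3], ..., [18,19]
b = torch.full((5, 2), -1.0)
push(a, b)
# a[:5] now holds b; a[5:] holds what used to be the first five rows
```

If you don’t need the update to be in-place, an out-of-place version avoids the overlap entirely: `a = torch.cat([b, a[:-b.shape[0]]], dim=0)`.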