Can we allocate several torch.Tensors contiguously?

Hi! I am trying to allocate several torch.Tensors contiguously in memory. Is there a way to do that? Thank you!!!

Well… their sizes are not equal, so allocating one big tensor and splitting it might not be very practical. Otherwise, maybe I could create a (1, m) tensor, split it, and reshape each piece? Maybe… but that feels dirty.

Thank you!!!

Allocating a large tensor and then slicing it should work. However, what's your use case, and why do you depend on this memory layout?
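A minimal sketch of that suggestion: allocate one flat buffer and carve out differently-shaped views that share its storage, so the tensors sit back-to-back in memory. The shapes below are made-up examples, not from this thread.

```python
import torch

# Hypothetical shapes of unequal size to pack into one allocation.
shapes = [(2, 3), (4,), (5, 5)]
numels = [torch.Size(s).numel() for s in shapes]

# One contiguous allocation holding all elements
# (on CUDA you could use device="cuda" instead).
buffer = torch.empty(sum(numels))

# Carve out views; each shares storage with `buffer`,
# so they are laid out consecutively in memory.
views, offset = [], 0
for shape, n in zip(shapes, numels):
    views.append(buffer[offset:offset + n].view(shape))
    offset += n
```

Writing through any view (e.g. `views[0].fill_(1.0)`) modifies the corresponding region of `buffer`, since no copies are made.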

Yes, I can… Well, I am developing a CUDA program. Anyway, your approach works for me. Thank you!!