Hi! I am trying to allocate several torch.Tensors contiguously in memory. Can we somehow do that? Thank you!!!
Well… their sizes are not equal. Allocating one big tensor and splitting it might be… not very practical. Otherwise, maybe I can create a (1, m) tensor, then split it and reshape each piece? Maybe… but it feels dirty.
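Actually, maybe something like this? A rough sketch of the flat-buffer idea, where one 1-D tensor is allocated and each "sub-tensor" is just a view into it (the helper name `alloc_contiguous` is made up here, not a PyTorch API):

```python
import math
import torch

def alloc_contiguous(shapes, dtype=torch.float32, device="cpu"):
    # Number of elements each requested shape needs.
    numels = [math.prod(shape) for shape in shapes]
    # One flat buffer holding everything back-to-back.
    flat = torch.empty(sum(numels), dtype=dtype, device=device)
    tensors, offset = [], 0
    for shape, n in zip(shapes, numels):
        # narrow() returns a view into `flat`, so no extra memory is allocated.
        tensors.append(flat.narrow(0, offset, n).view(shape))
        offset += n
    return flat, tensors

flat, (a, b) = alloc_contiguous([(3, 4), (5,)])
# a and b share storage with `flat` and sit back-to-back in memory.
assert a.data_ptr() + a.numel() * a.element_size() == b.data_ptr()
```

This handles unequal sizes fine, since each view just takes as many elements as its shape needs. The caveat is that the views share storage, so in-place ops on `flat` touch all of them.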
Thank you!!!