How to store a tensor in a temporary variable

I am new to PyTorch. I have a for loop running in which a tensor (whose size is unknown in advance) is returned on each iteration. I need a way to store the returned tensor in a temporary variable so I can compare it with the tensor from the next iteration, but I don't know how to initialize a tensor of unknown size.

I may have misunderstood your needs, but I'd just keep it in a Python variable as you normally would, e.g.

temp = None
for stuff in some_list:
    output = do_stuff_with(stuff)
    if temp is not None:
        # compare temp and output here
        ...
    temp = output
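
For the comparison itself, torch.equal checks that two tensors have the same shape and identical values, while torch.allclose does a tolerant element-wise float comparison; a minimal sketch with made-up tensors:

import torch

a = torch.tensor([1.0, 2.0])
b = torch.tensor([1.0, 2.0])
print(torch.equal(a, b))     # True: same shape and identical values
print(torch.allclose(a, b))  # True: element-wise equal within tolerance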

tensor = torch.Tensor() gives you an empty tensor. You can resize this with tensor.resize_as_(output) when you have another tensor.
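
A minimal sketch of that approach, with output standing in for whatever tensor the loop returns:

import torch

temp = torch.Tensor()       # empty tensor with zero elements
output = torch.randn(3, 4)  # stand-in for the tensor returned in the loop
temp.resize_as_(output)     # temp now has the same shape as output
temp.copy_(output)          # resize_as_ leaves values uninitialized, so copy them over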


Thanks @richard and @jpeg729, it worked.