torch.Storage reusing memory

I was curious: say I have a tensor x = torch.ones(10) and then do x = x[:5]. Now x.storage() is still a storage of size 10, even though x is only a view of size 5. If no other reference points to the rest of that storage, can another tensor reuse that memory automatically when it's needed? Or is it considered occupied?
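For reference, a minimal snippet reproducing what I'm seeing (x.storage() may print a deprecation warning on recent PyTorch versions, but the sizes are the point):

```python
import torch

x = torch.ones(10)
x = x[:5]

print(x.size())            # torch.Size([5]) -- the view holds 5 elements
print(x.storage().size())  # 10              -- the underlying storage still has 10
```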

It is considered occupied. If you want to release it, you’ll have to make a copy of the view, which will allocate a new tensor of the right size AFAIK.
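A minimal sketch of what that looks like, assuming you use clone() to make the copy:

```python
import torch

x = torch.ones(10)
x = x[:5]
print(x.storage().size())  # 10 -- the slice keeps the whole 10-element storage alive

x = x.clone()              # copy the view into freshly allocated storage
print(x.storage().size())  # 5  -- only the elements the view actually needs
# The original 10-element storage is freed once no tensor references it anymore.
```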