Object expands in memory despite size remaining constant

Hi, I’m running into the following issue. I have a list of tuples of the form

a = [(torch.tensor(3.), 4), (torch.tensor(5.), 6)]

and I want to add another list of a similar form, like

b = [(torch.tensor(5.), 6), (torch.tensor(10.), 3)]

I then want to add these two together elementwise to get something like

a = a + b
a = [(torch.tensor(8.), 10), (torch.tensor(15.), 9)]

When I do this, I notice that a takes up more memory than it did before, even though it should be the same size, and I can't figure out why the usage increases or what I can do to fix it. (I'm doing this on a larger scale, which leads to me running out of RAM.)

One important thing to note is that the torch tensors in question here are the result of forward passes through a network, and as a result have requires_grad set to True. I suspect the issue is related to the additional memory the tensors need in order to remember everything required to compute gradients properly. Does that seem right? If so, is there a way I can mitigate the increased memory usage, or is it just inherent if I want to keep requires_grad True? (That is required for my problem.)
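For what it's worth, I believe you can check whether a tensor is still attached to the autograd graph by looking at its grad_fn:

# grad_fn is non-None for tensors produced by differentiable ops
print(a[0][0].grad_fn)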

Any help would be appreciated, thanks!

Can you print out the result of a? To me, it looks like the list will contain all 4 elements, since + concatenates Python lists rather than adding them, which would also explain the growing memory usage. You will have to add elementwise to get the result you need.
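For example, something like this should do it (a minimal sketch, assuming scalar-valued tensors; in your real code the summed tensors will still carry their autograd history):

import torch

a = [(torch.tensor(3.), 4), (torch.tensor(5.), 6)]
b = [(torch.tensor(5.), 6), (torch.tensor(10.), 3)]

# pair up corresponding tuples with zip, then add component-wise
a = [(t1 + t2, n1 + n2) for (t1, n1), (t2, n2) in zip(a, b)]
# a is now [(tensor(8.), 10), (tensor(15.), 9)]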