I was expecting that the size of a given tensor with `x.requires_grad=True` would be different from `x.detach()` or `y = x.detach().clone()`. Can someone explain why the sizes of both objects are the same?

Thanks

EDIT: I mean the actual size on disk, not the shape of the tensor.

Hello @AlJazari
Let me clarify: what do you expect to see when a tensor requires grad and when it does not?

If you detach a tensor from the graph, its `requires_grad` becomes False and its `grad` property becomes None. In other words, the size of the tensor (meaning its shape) stays the same, but the `grad` property disappears.
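A minimal sketch of this behavior (assuming a simple leaf tensor; the exact values are illustrative):

```python
import torch

x = torch.ones(3, requires_grad=True)
(x * 2).sum().backward()   # populate x.grad

d = x.detach()             # same data, cut off from the graph

print(x.requires_grad)     # True
print(d.requires_grad)     # False
print(d.grad is None)      # True: the detached view has no grad
print(d.shape == x.shape)  # True: the shape is unchanged
```

Note that `detach()` returns a view sharing the same underlying data, so modifying `d` in place also modifies `x`.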

I was assuming the graph itself is attached to the tensor, so I was expecting that the size (i.e., actual size on disk, in bytes) would be different for a tensor with a graph versus a tensor detached from the graph.

For example, if the information in the graph is irrelevant to me, then the memory occupied by storing the tensor on disk should be the same whether this tensor includes the graph or is detached from it.
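The key point is that the autograd graph is not stored inside the tensor's data buffer at all: it is held by separate `grad_fn` (Function) objects that the tensor merely points to. A small sketch to illustrate (sizes assume the default float32 dtype):

```python
import torch

x = torch.randn(1000, requires_grad=True)
y = x * 2        # non-leaf tensor, carries a grad_fn linking it to the graph
d = y.detach()   # same data, no graph

# Both tensors reference a data buffer of identical size
print(y.element_size() * y.nelement())  # 4000 bytes (1000 x float32)
print(d.element_size() * d.nelement())  # 4000 bytes

print(d.data_ptr() == y.data_ptr())     # True: detach shares the same storage
print(y.grad_fn is not None)            # True: graph node attached
print(d.grad_fn is None)                # True: graph reference dropped
```

So detaching changes which bookkeeping objects the tensor references, not the size of the data buffer itself, which is why the sizes you measured came out the same.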