Questions about tensors and how they are saved in memory

Hello, I’m trying to understand how tensors are stored in memory. Suppose we have a tensor:

tensor = torch.tensor([1, 2, 3])

The metadata stores information such as the size (shape), strides, and storage offset, while the storage object holds the actual data (1, 2, 3).
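As a quick sketch of that split, you can inspect the metadata and the storage directly (byte counts below assume the default `int64` dtype on a standard PyTorch build):

```python
import torch

t = torch.tensor([1, 2, 3])

# Metadata lives on the tensor object itself.
print(t.shape)             # the size: torch.Size([3])
print(t.stride())          # elements to skip per index step: (1,)
print(t.storage_offset())  # where this tensor starts in its storage: 0

# The underlying storage holds the raw values as a flat buffer.
print(t.untyped_storage().nbytes())  # 24 bytes: 3 elements * 8 bytes (int64)
```

Two tensors can share one storage while having different metadata, which is how views avoid copying data.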

My questions are:

  1. Are the values in the tensor stored at unique offset addresses in memory?
  2. Where is the base address of the storage object saved? Is it stored in the metadata? I want to understand how the tensor accesses its storage object.

This post might be helpful to learn more about PyTorch internals.
The data is stored starting at the address returned by tensor.data_ptr(), and each value is accessed by indexing operations that combine the size and stride to compute an offset from that base address.
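That address arithmetic can be sketched as follows (a minimal example, assuming the default `int64` dtype so each element occupies 8 bytes):

```python
import torch

t = torch.tensor([1, 2, 3])
base = t.data_ptr()  # address of this tensor's first element

# Element i of a 1-D tensor lives at: base + i * stride * element_size.
for i in range(t.numel()):
    addr = base + i * t.stride(0) * t.element_size()
    print(hex(addr))

# A view shares the same storage; only the metadata (offset) changes.
v = t[1:]
print(v.data_ptr() - t.data_ptr())  # 8: one int64 element past the base
```

So the storage's base address is not duplicated per element; each value's address is derived on the fly from data_ptr(), the strides, and the element size.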
