@KFrank

Thanks! The answer is quite detailed.

As far as I know, Python divides data types into *mutable* types (when the value changes, the memory address, i.e. the `id`, stays the same) and *immutable* types (when the value changes, the object's memory address, i.e. its `id`, changes). According to the results you showed, I think `torch.Tensor` is similar to `list` in its memory management.
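The distinction can be checked directly with `id()` — a minimal sketch (CPython semantics; the value 1000 is arbitrary, chosen to stay outside the small-int cache):

```
x = 1000              # int is immutable
old_id = id(x)
x += 1                # rebinds x to a brand-new int object
assert id(x) != old_id

lst = [1, 2, 3]       # list is mutable
old_id = id(lst)
lst[0] = 99           # modifies the list in place
assert id(lst) == old_id
```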

So I ran the following experiments as example 6, but I found that the results were a little different.

Here is the tensor script:

```
# example 6 of torch.tensor
import torch

tensor = torch.arange(5)
a = tensor[0]
print('tensor:', tensor)
print('id (tensor[0]):', id(tensor[0]))
print('a:', a)
print('id (a):', id(a))

tensor[0] = 99
print('tensor:', tensor)
print('id (tensor[0]):', id(tensor[0]))
print('a:', a)
print('id (a):', id(a))
```

Here is the output of the tensor script:

```
tensor: tensor([0, 1, 2, 3, 4])
id (tensor[0]): 1663486027312
a: tensor(0)
id (a): 1663486027096
tensor: tensor([99, 1, 2, 3, 4])
id (tensor[0]): 1663504471168
a: tensor(99)
id (a): 1663486027096
```

Here is the list script:

```
# same experiment with a plain Python list
tensor = [0, 1, 2, 3, 4]
a = tensor[0]
print('tensor:', tensor)
print('id (tensor[0]):', id(tensor[0]))
print('a:', a)
print('id (a):', id(a))

tensor[0] = 99
print('tensor:', tensor)
print('id (tensor[0]):', id(tensor[0]))
print('a:', a)
print('id (a):', id(a))
```

Here is the output of the list script:

```
tensor: [0, 1, 2, 3, 4]
id (tensor[0]): 1522970816
a: 0
id (a): 1522970816
tensor: [99, 1, 2, 3, 4]
id (tensor[0]): 1522973984
a: 0
id (a): 1522970816
```
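For the list, my understanding is that `a = tensor[0]` copies a reference to the int object `0`, and `tensor[0] = 99` merely rebinds that list slot to a different int object, leaving `a` untouched — a quick check:

```
lst = [0, 1, 2, 3, 4]
a = lst[0]            # a references the same int object as lst[0]
lst[0] = 99           # rebinds the slot only; the old int object is unchanged
assert a == 0         # a still refers to the original object
assert lst[0] == 99
```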

The value of `a` differs between the two scripts, so it seems that the memory management of a tensor is not exactly the same as that of a list?

And I don't quite understand why the value of `a` changes in the `torch.tensor` script. In my opinion, the process should be that the variable `a` keeps referring to the original `tensor[0]` address, while `tensor[0]` is rebound to the new address holding 99. So why does the value at the original `tensor[0]` address change?
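To check whether the two tensors actually share memory, I would compare `data_ptr()` (assuming it reports the address of the underlying storage). My guess is that `tensor[0]` builds a fresh 0-dim wrapper object on every call — hence the changing `id` — but all these wrappers are views of the same storage, and `tensor[0] = 99` writes into that storage in place:

```
import torch

tensor = torch.arange(5)
a = tensor[0]                      # new 0-dim tensor, but a view of the same storage
assert tensor[0] is not tensor[0]  # each indexing call returns a fresh wrapper object
assert a.data_ptr() == tensor.data_ptr()  # same underlying memory

tensor[0] = 99                     # in-place write into the shared storage
assert a.item() == 99              # the view sees the new value
```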