How can I cut the connection between two torch tensors?

a = self.a 
a = a[:, :, None]
a[:, j % 10] = b

When I change the value of a, I find that the value of self.a changes too. How can I keep the value of self.a unchanged?

Hi Zhao!

This is a general python issue rather than an issue specific to pytorch.

a and self.a are python references that both refer to the same object
(in your case a pytorch tensor).

Consider this pure-python illustration of how references work:

>>> l = [1, 2, 3]       # l is a reference that refers to an object that is a list
>>> l_same = l          # l_same is another reference that refers to the same object as l
>>> l_copy = l.copy()   # l_copy refers to a new object that is a copy of the original list
>>>
>>> id (l)              # the object id (location in memory) of the original list
2856103785280
>>> id (l_same)         # l_same refers to the same object as shown by its id
2856103785280
>>> id (l_copy)         # l_copy refers to a different object (different id)
2856103722624
>>>
>>> l                   # contents of (list referred to by) l
[1, 2, 3]
>>> l_same[1] = 42      # modify object referred to by l_same
>>> l                   # contents of l is modified because it's the same object
[1, 42, 3]
>>> l_copy[1] = 999     # modify object referred to by l_copy
>>> l                   # change not reflected in l because l_copy is a different object
[1, 42, 3]
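
The same behavior shows up with pytorch tensors. Here is a minimal sketch
(the tensor values are just for illustration) that mirrors the list example above:

>>> import torch
>>> t = torch.tensor([1, 2, 3])   # t refers to a tensor object
>>> t_same = t                    # t_same refers to the same tensor object
>>> t_copy = t.clone()            # t_copy refers to an independent copy
>>>
>>> t_same[1] = 42                # modify the shared tensor
>>> t                             # the change is visible through t
tensor([ 1, 42,  3])
>>> t_copy[1] = 999               # modify only the copy
>>> t                             # the original tensor is unaffected
tensor([ 1, 42,  3])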

To make an independent copy of a pytorch tensor, use .clone():

a = self.a.clone()

(If self.a is part of the computation graph and you don’t want the cloned
copy a to also be part of the computation graph, use .detach() first.)
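
Putting the two together, something like the following sketch (self_a here is
just a stand-in name for your self.a) gives you a copy you can modify freely:

>>> import torch
>>> self_a = torch.zeros(2, 3)        # stand-in for your self.a
>>> a = self_a.detach().clone()       # independent copy, not in the computation graph
>>> a[:, 1] = 7.0                     # modify the copy
>>> a
tensor([[0., 7., 0.],
        [0., 7., 0.]])
>>> self_a                            # the original is untouched
tensor([[0., 0., 0.],
        [0., 0., 0.]])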

Best.

K. Frank