What is the difference between "Y=X" and "Y=X.clone()"?

I am currently implementing a residual connection in my architecture. I found code that uses .clone() at the junction. Will I have a problem if I don't use .clone() for the skip connection?
This is a simplified version of my model.

 def forward(self, input):

    out = self.conv1(input)
    out = self.BatchNorm1(out)
    out = self.ReLU(out)
    out = self.conv2(out)
    out = self.BatchNorm2(out)

    residual_output = input + out
    return residual_output
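For a plain additive skip connection like the one above, `input + out` allocates a fresh tensor, so `input` itself is never modified and `.clone()` is not needed; it only matters if `input` gets changed in place later. A minimal runnable sketch of such a block (the channel count and layer sizes are assumptions for illustration, not from the original model):

```python
import torch
import torch.nn as nn

# Hypothetical residual block mirroring the snippet above;
# the channel count (8) is made up for illustration.
class ResBlock(nn.Module):
    def __init__(self, ch=8):
        super().__init__()
        self.conv1 = nn.Conv2d(ch, ch, 3, padding=1)
        self.BatchNorm1 = nn.BatchNorm2d(ch)
        self.ReLU = nn.ReLU()
        self.conv2 = nn.Conv2d(ch, ch, 3, padding=1)
        self.BatchNorm2 = nn.BatchNorm2d(ch)

    def forward(self, input):
        out = self.conv1(input)
        out = self.BatchNorm1(out)
        out = self.ReLU(out)
        out = self.conv2(out)
        out = self.BatchNorm2(out)
        # `input + out` creates a new tensor; `input` is untouched,
        # so no .clone() is required for this kind of skip connection.
        return input + out

x = torch.randn(2, 8, 16, 16, requires_grad=True)
y = ResBlock()(x)
y.sum().backward()  # gradients flow through both branches
```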

In current versions of PyTorch, x.clone() can be regarded as an x + 0 operation: it returns a new tensor with its own memory, but it stays in the autograd graph, so gradients still flow back to x through it.
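A quick way to check the x + 0 analogy (a sketch, not from the original post):

```python
import torch

x = torch.ones(3, requires_grad=True)
y = x.clone()   # new memory, but still recorded in the autograd graph
y.sum().backward()
print(x.grad)   # tensor([1., 1., 1.]) -- same gradient as (x + 0).sum()
```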


When you assign one tensor to another variable (Y = X), both names refer to the same underlying data.
Using .clone() copies the data into a new tensor.


import torch

a = torch.zeros(10)
b = a          # b shares a's underlying storage
c = a.clone()  # c gets its own copy of the data
a[0] = 1

print(b[0])  # tensor(1.) -- b sees the change
print(c[0])  # tensor(0.) -- c does not

Probably in the code you've seen, they manipulate X later on and want to keep Y as it was.


This shows that a and b are the same object (modifying a also changes b), while c is an independent copy. Clone creates a copy of the tensor with its own memory.
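You can confirm the sharing directly by comparing storage pointers (a sketch using the same names as above):

```python
import torch

a = torch.zeros(10)
b = a
c = a.clone()

print(a.data_ptr() == b.data_ptr())  # True  -- b is the same storage as a
print(a.data_ptr() == c.data_ptr())  # False -- c owns its own storage
```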

As per the docs:

Returns a copy of the self tensor. The copy has the same size and data type as self.