+= with tensor copying

Hi guys.

I got some weird output from PyTorch.

My environment is below:
PyTorch 0.4.1

My code is:

import torch

loss = torch.ones(1) * 10
a = loss.detach()
print(a)
# tensor([10.])
loss += 20
print(a)
# tensor([30.])

Is this normal? Then how can I copy a tensor safely?

I got the same result from a = loss.data.
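For reference, a minimal check (same setup as above) showing that .data behaves the same way here:

import torch

loss = torch.ones(1) * 10
a = loss.data    # like detach(), this shares storage with loss
loss += 20
print(a)
# tensor([30.])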

I appreciate any reply!

Thanks.

I could get the correct result with the code below. It seems a += b behaves differently from a = a + b in PyTorch?

loss = torch.ones(1) * 10
a = loss
print(a)
# tensor([10.])
loss += 20
print(a)
# tensor([30.])

loss = torch.ones(1) * 10
a = loss
print(a)
# tensor([10.])
loss = 20 + loss
print(a)
# tensor([10.])

+= modifies the tensor in place. Both .data and .detach() don't copy; they share storage with the original tensor, which is why you see this behavior. If you want a copy, use .clone().
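A minimal sketch of the difference, assuming a recent PyTorch build:

import torch

loss = torch.ones(1) * 10

shared = loss.detach()   # shares storage with loss
copied = loss.clone()    # allocates new storage

loss += 20               # in-place update, visible through shared views

print(shared)
# tensor([30.]) - follows the in-place change
print(copied)
# tensor([10.]) - independent copy, unaffected

If loss required gradients, loss.detach().clone() would give a copy that is also cut off from the autograd graph.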


Simon,

Thanks for your reply. I didn't know about clone(). I appreciate it! :)