meipuru344
(Meipuru344)
November 26, 2018, 3:52am
#1
Hi guys,

I got some weird output from PyTorch.

My environment is below:
PyTorch 0.4.1

My code is:

```
loss = torch.ones(1)*10
a = loss.detach()
print(a)
#return tensor([10.])
loss += 20
print(a)
#return tensor([30.])
```

Is this normal? Then how can I copy a tensor safely?

I got the same result from `a = loss.data`.

I appreciate any reply!

Thanks.

meipuru344
(Meipuru344)
November 26, 2018, 4:06am
#2
I could get the correct result. It seems `a += b` is deprecated in PyTorch?

```
loss = torch.ones(1)*10
a = loss
print(a)
#return tensor([10.])
loss += 20
print(a)
#return tensor([30.])
```

```
loss = torch.ones(1)*10
a = loss
print(a)
#return tensor([10.])
loss = 20 + loss
print(a)
#return tensor([10.])
```
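A minimal sketch (assuming PyTorch is installed) of why the two snippets above differ: `+=` mutates the existing tensor object in place, while `loss = 20 + loss` creates a new tensor and rebinds the name, leaving the old tensor (and any aliases of it) untouched:

```python
import torch

loss = torch.ones(1) * 10
before = id(loss)

loss += 20                 # in-place: the same tensor object is modified
assert id(loss) == before

loss = 20 + loss           # out-of-place: the name is rebound to a new tensor
assert id(loss) != before
```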

SimonW
(Simon Wang)
November 26, 2018, 7:17am
#4
`+=` modifies in place. Both `.data` and `.detach()` don't copy; they share storage with the original tensor, so you see this behavior. If you want a copy, use `.clone()`.
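A minimal sketch (assuming PyTorch is installed) contrasting the shared-storage behavior of `.detach()` with the independent copy made by `.clone()`:

```python
import torch

loss = torch.ones(1) * 10
view = loss.detach()   # shares storage with loss
copy = loss.clone()    # independent copy of the data

loss += 20             # in-place update mutates the shared storage
print(view)            # tensor([30.]) -- follows the in-place change
print(copy)            # tensor([10.]) -- unaffected

# The storage pointers confirm which tensors alias each other:
print(loss.data_ptr() == view.data_ptr())   # True
print(loss.data_ptr() == copy.data_ptr())   # False
```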


meipuru344
(Meipuru344)
November 26, 2018, 9:47pm
#5
Simon

Thanks for your reply! I didn't know about `.clone()`. I appreciate it.