cosmmb
February 14, 2018, 7:43am
1
Hi there,
I have a question about how to update or copy values from one variable to another without breaking the computation graph for backprop.
I know one way to do it is X.data.copy_(Y.data), where X and Y are both Variables. But going through .data breaks the computation graph for backprop.
Does anyone know how to do this?
Thanks!
jpeg729
(jpeg729)
February 14, 2018, 7:56am
2
What about X = Y.clone()
?
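For example (a minimal sketch on the newer tensor API, PyTorch 0.4+, where Variable is merged into Tensor), clone participates in autograd, so gradients still flow back to the original:

```python
import torch

y = torch.randn(3, requires_grad=True)
x = y.clone()        # clone is a differentiable op; x.grad_fn is set

x.sum().backward()   # gradients flow through the clone back to y
print(y.grad)        # each element of y receives gradient 1
```

Unlike .data or .detach(), clone keeps the copy connected to the graph.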
cosmmb
February 14, 2018, 4:43pm
3
Thanks for your reply!
What if I want to do something like X[:,1] = Y.clone()? This always gives me an error like
“in-place operations can be only used on variables that don’t share storage with any other variables, but detected that there are 2 objects sharing it”
richard
February 14, 2018, 5:25pm
4
What version of pytorch are you using?
richard
February 14, 2018, 11:08pm
6
The following works for me on PyTorch 0.3.1. What code are you running to get that error?
import torch
from torch.autograd import Variable

x = Variable(torch.randn(3, 3))
y = Variable(torch.randn(3))
x[:, 1] = y
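For what it's worth, on the newer tensor API (PyTorch 0.4+, no Variable wrapper needed) the in-place slice assignment also keeps the graph intact, so gradients reach the assigned tensor, assuming the target tensor is not itself a leaf that requires grad:

```python
import torch

y = torch.randn(3, requires_grad=True)
x = torch.zeros(3, 3)   # plain tensor; does not require grad itself

x[:, 1] = y             # in-place copy; autograd records the slice assignment
x.sum().backward()      # gradient flows through the assigned column to y
print(y.grad)           # each element of y receives gradient 1
```

If x were a leaf with requires_grad=True, the in-place write would raise an error instead; building x from non-leaf pieces (or via torch.cat) avoids that.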