How to update values in a Variable?

Hi there,

I have a question about how to update or copy values from one variable to another without breaking the computation graph for backprop.

I know one way to do it is X.data.copy_(Y.data), where X and Y are both Variables. But going through .data breaks the computation graph for backprop.

Does anyone know how to do this?

Thanks!

What about X = Y.clone()?
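For what it's worth, here is a minimal sketch (using the post-0.4 API, where Variable has been merged into Tensor) showing that clone() is a differentiable operation, so gradients still flow back to the source:

```python
import torch

# On PyTorch >= 0.4, tensors carry autograd state directly, so no
# Variable wrapper is needed. clone() is tracked by autograd.
y = torch.ones(3, requires_grad=True)
x = y.clone()            # x stays in the computation graph
loss = (2 * x).sum()
loss.backward()
print(y.grad)            # gradients reach y through the clone
```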

Thanks for your reply!

What if I want to do something like X[:,1] = Y.clone()? This always gives me an error like

“in-place operations can be only used on variables that don’t share storage with any other variables, but detected that there are 2 objects sharing it”

What version of PyTorch are you using?

I am using PyTorch 0.3.0.

The following works for me on PyTorch 0.3.1. What code are you running to get that error?

import torch
from torch.autograd import Variable

x = Variable(torch.randn(3, 3))
y = Variable(torch.randn(3))
x[:, 1] = y
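On current PyTorch versions (0.4 and later, where Variable is merged into Tensor), the same slice assignment is tracked by autograd, so it need not break backprop; a quick sketch to verify:

```python
import torch

# Assigning a grad-requiring tensor into a slice is recorded by
# autograd (as a CopySlices node), so gradients flow back to y
# even though x was modified in place.
y = torch.randn(3, requires_grad=True)
x = torch.zeros(3, 3)
x[:, 1] = y
x.sum().backward()
print(y.grad)            # each element of y contributes once
```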