In-place assignment for Variable?

Say I have 2 Variables v1, v2, this is what I want to do:

v3 = torch.cat( [f1(v1), f2(v2)], 0) # f1 and f2 are functions

However, this first allocates memory for f1(v1) and f2(v2), and then allocates another block for v3.

A more desirable way would be to allocate a buffer of v3's size up front and assign the corresponding components to be f1(v1) and f2(v2), like

v3 = create_placeholder() # create a placeholder Variable
v3[:10] = f1(v1)
v3[10:20] = f2(v2)

Is there any solution following this idea?
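For what it's worth, in current PyTorch (after the Variable/Tensor merge in 0.4), autograd does track slice assignment into a pre-allocated tensor, so the placeholder pattern can be written directly. A minimal sketch, using `x * 2` and `x + 1` as stand-ins for f1 and f2:

```python
import torch

v1 = torch.randn(10, requires_grad=True)
v2 = torch.randn(10, requires_grad=True)

# Pre-allocated buffer the size of v3; no intermediate cat allocation.
v3 = torch.zeros(20)
v3[:10] = v1 * 2      # stand-in for f1(v1)
v3[10:20] = v2 + 1    # stand-in for f2(v2)

# Gradients flow back through the in-place slice assignments.
v3.sum().backward()
```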

A more complicated example:
I want to create a Variable v3 from two Variables v1 and v2.
Suppose v3 has size (2, 3, 10), and v1 and v2 each have size (10).

I want v3 to be:
v3[0,0] = v1
v3[1,2] = v2
and 0 elsewhere.

How should I construct v3?
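Assuming a recent PyTorch version where in-place indexing is tracked by autograd, the (2, 3, 10) example can be built the same way: allocate a zero tensor and write the two rows in place. A sketch:

```python
import torch

v1 = torch.randn(10, requires_grad=True)
v2 = torch.randn(10, requires_grad=True)

v3 = torch.zeros(2, 3, 10)   # 0 elsewhere by construction
v3[0, 0] = v1
v3[1, 2] = v2

v3.sum().backward()          # gradients reach v1 and v2
```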

Hi,

If input is the output of the cat operation, it should be contiguous already.
So the call to .contiguous() is already a no-op.

Ok. Bad example then. (But still, torch.cat will use additional memory.)

I will change the question.

I don’t think that’s supported, because Variables only track the history for the entire tensor/variable, and your example requires two divergent histories.
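Under that constraint, one out-of-place workaround is to assemble v3 from zero blocks with stack, so every intermediate result has a single history. A sketch for the (2, 3, 10) example (the zero blocks cost extra memory, which is the trade-off):

```python
import torch

v1 = torch.randn(10, requires_grad=True)
v2 = torch.randn(10, requires_grad=True)

zero = torch.zeros(10)
# Each row is built out-of-place, so autograd sees one history per tensor.
row0 = torch.stack([v1, zero, zero])   # puts v1 at position (0, 0)
row1 = torch.stack([zero, zero, v2])   # puts v2 at position (1, 2)
v3 = torch.stack([row0, row1])         # final shape: (2, 3, 10)
```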