**Question**: How can I assign values to tensor `p` without losing the shared storage with `t`, such that every change to `p` is visible in `t`? Note that I need to avoid element-wise iteration.

**Context**:

I have a huge tensor `t` of shape n x m, and another huge tensor `v` of shape l x k, containing information that has to be parsed somehow into `t`.

Inside a loop, I repeatedly take a slice of `t`, which I call `p`, similar to the following code (assuming that the array `pos` is given):

```
for i in range(0, len(pos) - 1):
    # narrow(dim, start, length) returns a view into t;
    # the slice covers rows pos[i] up to (but excluding) pos[i + 1]
    p = t.narrow(0, pos[i], pos[i + 1] - pos[i])
    # now assign values to p that come from v by some parsing logic
```

Since `t` and `p` share the same underlying storage, values assigned to `p` should also be assigned to `t`. (Which in fact works if I manually do, for example, `p[0][0] = 1`.)
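
For illustration, here is a minimal sketch of that shared-storage behavior (the shapes are made up):

```
import torch

t = torch.zeros(4, 3)
p = t.narrow(0, 1, 2)  # rows 1-2 of t, as a view that shares t's storage
p[0][0] = 1            # in-place write through the view
print(t[1][0])         # tensor(1.) -- the change is visible in t
```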

Due to efficiency problems I cannot loop through every element of `t` by index.

Since my parsing logic is encoded in an index array, I thought that `torch.gather` would fit my needs perfectly. Unfortunately, the returned tensor is a new copy of the data, so `p` is assigned the correct values but `t` is not (the reference to the shared storage is lost).
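
To make that failure mode concrete, a minimal sketch with made-up shapes and a dummy `index` (stand-ins for my real parsing logic):

```
import torch

t = torch.zeros(4, 3)
p = t.narrow(0, 1, 2)                        # view into t
v = torch.arange(12.0).reshape(4, 3)
index = torch.zeros(2, 3, dtype=torch.long)  # dummy parsing indices

p = torch.gather(v, 0, index)  # gather allocates a new tensor, so p
                               # no longer points into t's storage
print(t)                       # t is unchanged (still all zeros)
```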

Something like `p = v[index].clone().detach()` did not work either.
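
A minimal repro of that attempt (reusing the made-up shapes from above; again, `index` is only a stand-in):

```
import torch

t = torch.zeros(4, 3)
p = t.narrow(0, 1, 2)          # view into t
v = torch.arange(12.0).reshape(4, 3)
index = torch.tensor([0, 1])   # dummy parsing indices

p = v[index].clone().detach()  # rebinds the name p to a fresh tensor;
                               # the view into t is simply dropped
print(t)                       # t is unchanged (still all zeros)
```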

**Note**: I cannot operate on `t` directly due to some program logic.