Modify data of a tensor in-place

Hello, I am trying to modify the data of a PyTorch tensor in-place, specifically to assign the data of a new tensor to an existing tensor. The problem is that I don’t want to reassign the new tensor to the old one (old = new), because that makes old point to a new memory address, so other references to the old tensor won’t see the update. I cannot use copy_, because the tensors are not broadcastable; moreover, the new tensor will always be larger than the old one (it is the result of a concatenation).
I tried using the data attribute and assigning with =, but this causes problems with autograd: modifications made through .data do not affect the computation graph (see the PyTorch 0.4.0 release notes in the pytorch/pytorch repository on GitHub).
Do you have any idea how I can do this, or at least whether it’s possible?
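A minimal sketch reproducing the two failure modes described above (the tensor names and shapes are illustrative, not from the original post):

```python
import torch

# `new` stands in for the larger concatenation result; `old` is the
# tensor we would like to update in place.
old = torch.randn(3, requires_grad=True)
new = torch.cat([torch.randn(3), torch.randn(2)])  # 5 elements > 3

# Failure mode 1: copy_ raises because 5 elements cannot be
# broadcast into a 3-element destination.
try:
    old.detach().copy_(new)
except RuntimeError as e:
    print("copy_ failed:", e)

# Failure mode 2: assigning through .data keeps the Python object
# identity (other references still see the change), but the assignment
# is invisible to autograd, so the computation graph is not updated.
ref = old
old.data = new.clone()
print(ref is old)   # same Python object, storage was swapped out
print(old.shape)    # now torch.Size([5])
```

Note that the `.data` assignment above is exactly the approach the question rules out: it preserves object identity but silently detaches the update from gradient tracking.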

I don’t understand how in-place operations could be possible if the sizes differ.