In C++, if I construct a torch tensor from another tensor via
Tensor(Tensor &) (with one ampersand!), does that copy the element data from the source tensor into a new location for the new tensor? Or is it more like a reference, where a subsequent update to the elements of either tensor will be seen in the other? Or is it a lazy copy, where they share element data until either is modified, at which point a copy is made so that they no longer share data?
Is the behavior for an assignment of a tensor to a tensor different from the construction of a tensor from another? If so, how?
I’m not a specialist in the C++ API, but a good mental model is
Tensor = shared_ptr<ActualTensor>.
So all the reference/const constructs from C++ do the same thing they would on a shared_ptr. In particular, copy construction and copy assignment both copy the handle, not the element data: the new Tensor shares its contents with the source, and a write through either is visible in both. If you want an actual deep copy of the data, use t.clone().
Note that you can also have Tensors that are views of each other:
auto t2 = t1.alias();, for example. Here t1 and t2 are two independent ActualTensor objects, but they share the same memory for their content. A change to the data of one also changes the other, while changes to metadata (sizes, strides, etc.) are not reflected.
Thanks! Where you write “I’m not a specialist of the c++ API” are you being modest or should I seek out a second opinion? If the latter … are there any second opinions out there?
I don’t think you will find a simpler answer than that, I’m afraid.
If you have specific questions, you can ask glaringlee for the C++ API or ezyang for more general backend questions. But they are quite busy and might be less responsive.
I like your answer. Thank you.