Output variables in functions

Coming from C++, I am used to the fact that the syntax

def foo(x):
    # ... change x ...

copies the argument passed at the call site into the local variable x.

However, in PyTorch I believe that reference semantics implicitly cause the object passed at the call site to be modified as well when x is changed inside the function.

Is that true, or does this only hold for NumPy/PyTorch objects, etc.?

And if so, does this render return values largely obsolete?

Here is my view: in C/C++/Java, the pass-by-value paradigm (elementary values such as int, char, float, etc.) works the way you described, by copying the values into the stack frame of the called function.

But the pass-by-reference paradigm (pointers, objects, etc.) works by passing just a reference to the called function, avoiding the copying overhead for the sake of memory efficiency. The reference gives the called function the ability to change the data passed as its parameters.

Even in Python, these paradigms work much the same way (as far as I know). Strictly speaking, Python passes every argument as a reference to an object; the practical difference is that simple values (numbers, strings, etc.) are immutable and cannot be changed in place, so they behave as if passed by value, whereas NumPy arrays and PyTorch tensors are mutable. Hence, functions are able to modify parameters that are NumPy arrays or PyTorch tensors.
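To make that concrete, here is a minimal sketch (the function names scale_in_place and rebind are just illustrative):

import torch

def scale_in_place(t):
    t.mul_(2)  # in-place mutation: the caller's tensor object is modified

def rebind(x):
    x = x * 2  # rebinding: x now points to a new local object; the caller is unaffected

t = torch.ones(3)
scale_in_place(t)
print(t)  # tensor([2., 2., 2.]) -- the mutation is visible at the call site

n = 5
rebind(n)
print(n)  # 5 -- the caller's number is unchanged

Note that rebind(t) would also leave the caller's tensor untouched: assignment inside a function only rebinds the local name, so what matters is mutation versus rebinding, not the parameter's type. If you want C++-style value semantics, make the copy explicit, e.g. t.clone() for a tensor or arr.copy() for a NumPy array.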

When it comes to return values, it is mostly a matter of programming style, guidelines, and convenience. Of course, we could eliminate the return value and instead pass a placeholder for the result as a function parameter; that is just not a prevalent style.
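That said, PyTorch itself exposes this output-parameter style through the optional out= argument that many operations accept, which writes the result into a preallocated tensor instead of returning a new one:

import torch

a = torch.ones(3)
b = torch.ones(3)
result = torch.empty(3)  # preallocated placeholder for the output

torch.add(a, b, out=result)  # result now holds tensor([2., 2., 2.])

This avoids allocating a fresh tensor on every call, which is why the convention survives in numeric libraries even though plain return values are the idiomatic default in Python.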