What do the underscores at the end of functions mean in pytorch?

I’ve been seeing underscores at the end of many functions in pytorch. What does this mean?

I think I read in some chat or question that it means operations are done “in place” but I wanted to make sure that was correct (and provide a way for future people to search for this question) and to clarify what “in place” meant.


This was answered in How does one make sure that the parameters are update manually in pytorch using modules?


From fmassa’s answer in another thread (source: How does one make sure that the parameters are update manually in pytorch using modules?):

Whenever we have an underscore at the end of a function name in pytorch, it means that the function operates in-place.
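A minimal sketch of the convention, contrasting `.add()` with `.add_()` (the tensor values are just for illustration):

```python
import torch

t = torch.ones(3)

# Out-of-place: .add() returns a new tensor; t is untouched.
out = t.add(1)
print(t)    # tensor([1., 1., 1.])
print(out)  # tensor([2., 2., 2.])

# In-place: .add_() modifies t directly and returns t itself.
result = t.add_(1)
print(t)            # tensor([2., 2., 2.])
print(result is t)  # True

# Python augmented assignment (+=, *=) is also in-place.
alias = t
t += 1
print(alias)  # tensor([3., 3., 3.]) -- alias shares t's storage
```

Note that the in-place variant still returns the tensor, which is why chained calls like `t.add_(1).mul_(2)` work; it is the same object every time.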

If you don’t know what in-place means, see albanD’s answer (source: What is `in-place operation`?):

An in-place operation is an operation that directly changes the content of a given Tensor without making a copy. In-place operations in pytorch are always postfixed with a `_`, like `.add_()` or `.scatter_()`. Python operations like `+=` or `*=` are also in-place operations.

As a side comment, these sorts of operations don’t get added to the computation graph, so they should be used with caution.
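To illustrate why caution is needed, here is a small sketch (the exact error message wording varies across pytorch versions): modifying, in place, a tensor that autograd saved for the backward pass makes the later `backward()` call fail.

```python
import torch

a = torch.ones(3, requires_grad=True)
b = a.exp()  # autograd saves b: the backward of exp() reuses its own output

b.add_(1)    # in-place change bumps b's internal version counter

try:
    b.sum().backward()
except RuntimeError as e:
    # autograd detects that a tensor it saved was modified in place
    print("RuntimeError:", e)
```

Using the out-of-place `b + 1` instead of `b.add_(1)` avoids the error, since the saved tensor is left intact.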
