Suggestion to make masked_scatter_() a torch function

Hi all,
Do you think making masked_scatter_() a torch function (torch.masked_scatter()) would be a good idea? Since it is currently an in-place function, the error “in-place operations can be only used on variables that don’t share storage with any other variables, but detected that there are x objects sharing it” sometimes comes up.
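
For reference, here is roughly the kind of in-place call I mean (the tensor shapes and values are only an illustrative assumption):

import torch

t = torch.zeros(2, 3)
mask = torch.tensor([[True, False, True],
                     [False, True, False]])
source = torch.arange(6, dtype=torch.float32)

# modifies t in place, copying values from source wherever mask is True
t.masked_scatter_(mask, source)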
Thank you!

Hi, you can always do the following to get an out-of-place operation:

new_t = t.clone().masked_scatter_(...)
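
For example, a minimal sketch (the tensors are only an assumed illustration). The clone owns fresh storage, so the scatter no longer mutates a tensor that might share storage with another variable:

import torch

t = torch.zeros(2, 3)
mask = torch.tensor([[True, False, True],
                     [False, True, False]])
source = torch.arange(6, dtype=torch.float32)

# t itself is left untouched; new_t is the cloned tensor with the scattered values
new_t = t.clone().masked_scatter_(mask, source)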