torch.scatter_() with a scalar src behaves differently depending on whether the parameter name is specified

It seems that torch.scatter_() with a scalar src argument behaves differently depending on whether the parameter name is specified or not.

For example:

import torch

x = torch.ones(2, 10)
idx = torch.tensor([[3, 4], [5, 6]]).long()
# 1. when parameter name `src` is not specified
x.scatter_(-1, idx, 0)
"""
x is:
tensor([[1., 1., 1., 0., 0., 1., 1., 1., 1., 1.],
        [1., 1., 1., 1., 1., 0., 0., 1., 1., 1.]])
"""
# 2. when parameter name `src` is specified
x.scatter_(-1, idx, src=0)
"""
---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
Input In [35], in <cell line: 1>()
----> 1 x.scatter_(-1, idx, src=0)

TypeError: scatter_() received an invalid combination of arguments - got (int, Tensor, src=int), but expected one of:
 * (int dim, Tensor index, Tensor src)
      didn't match because some of the arguments have invalid types: (int, Tensor, !src=int!)
 * (int dim, Tensor index, Tensor src, *, str reduce)
 * (int dim, Tensor index, Number value)
      didn't match because some of the keywords were incorrect: src
 * (int dim, Tensor index, Number value, *, str reduce)
"""
# 3. when parameter name `value` is specified
x.scatter_(-1, idx, value=0)
"""
x is:
tensor([[1., 1., 1., 0., 0., 1., 1., 1., 1., 1.],
        [1., 1., 1., 1., 1., 0., 0., 1., 1., 1.]])
"""

Yes, it’s not a bug: the parameters src and value belong to separate overloads in the valid combinations of arguments, so a scalar passed as src= doesn’t match any of them. But even though it’s not a bug, it can confuse users, and I cannot see why these two parameter names need to be separated. If the separation is internally necessary, then I think the documentation should mention the value parameter.
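For reference, here is a minimal sketch of the two keyword forms that do match the overloads listed in the error message (it reuses the same x and idx as above; torch.zeros_like is just one way to build an all-zero src tensor and is my choice, not something from the original post):

import torch

x = torch.ones(2, 10)
idx = torch.tensor([[3, 4], [5, 6]]).long()

# A scalar fill matches the (dim, index, Number value) overload, so the keyword is `value`.
x.scatter_(-1, idx, value=0)

# A tensor fill matches the (dim, index, Tensor src) overload, so the keyword is `src`;
# the scalar has to be materialized as a tensor with the same dtype as x first.
x.scatter_(-1, idx, src=torch.zeros_like(idx, dtype=x.dtype))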

Thank you.

Thanks for raising the potentially confusing docs! A contribution to improve the documentation is more than welcome, so would you be interested in improving it? :slight_smile:
