Where can I find the at::clamp_min function?

I found that the torch.nn.functional.relu function calls torch.relu and torch.relu_:

def relu(input: Tensor, inplace: bool = False) -> Tensor:
    r"""relu(input, inplace=False) -> Tensor
    Applies the rectified linear unit function element-wise. See
    :class:`~torch.nn.ReLU` for more details.
    """
    if has_torch_function_unary(input):
        return handle_torch_function(relu, (input,), input, inplace=inplace)
    if inplace:
        result = torch.relu_(input)
    else:
        result = torch.relu(input)
    return result


relu_ = _add_docstr(
    torch.relu_,
    r"""
relu_(input) -> Tensor
In-place version of :func:`~relu`.
""",
)
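As a quick illustration of the two paths this wrapper dispatches to, here is a minimal sketch (my own example, assuming a plain float tensor) comparing the out-of-place and in-place variants:

import torch

x = torch.tensor([-1.0, 0.5, -2.0, 3.0])

# torch.relu returns a new tensor and leaves x untouched
y = torch.relu(x)
print(y)  # tensor([0.0000, 0.5000, 0.0000, 3.0000])
print(x)  # tensor([-1.0000,  0.5000, -2.0000,  3.0000])

# torch.relu_ rewrites x in place and returns it
torch.relu_(x)
print(x)  # tensor([0.0000, 0.5000, 0.0000, 3.0000])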

Tracing further, I eventually arrived at at::clamp_min(self, 0) in the C++ implementation of relu, but I'm not sure where that function is defined on GitHub:

Tensor relu(const Tensor & self) {
  TORCH_CHECK(self.scalar_type() != at::kBool, "Boolean inputs not supported for relu");
  return at::clamp_min(self, 0);
}

Tensor & relu_(Tensor & self) {
  TORCH_CHECK(self.scalar_type() != at::kBool, "Boolean inputs not supported for relu");
  return at::clamp_min_(self, 0);
}
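As a sanity check (a small sketch of my own; torch.clamp_min is, as far as I can tell, the Python-level binding of at::clamp_min), the equivalence above is easy to verify from Python:

import torch

x = torch.randn(5)

# relu is implemented as clamp_min(input, 0), so the results should match exactly
print(torch.equal(torch.relu(x), torch.clamp_min(x, 0)))  # True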

Could you please point me to where this function is defined in the GitHub repository?

The CPU kernel can be found here and the CUDA kernel here.
