Controlling requires_grad

When I have a tensor with requires_grad=False, some operations set requires_grad=True on their output.

For example, this happens with torch.conv2d but not with torch.relu.
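For reference, here is a minimal snippet showing what I'm seeing. (I'm assuming the conv weight has requires_grad=True, as it would for an nn.Conv2d parameter; the input itself has requires_grad=False.)

```python
import torch

x = torch.randn(1, 3, 8, 8)                      # requires_grad=False by default
w = torch.randn(4, 3, 3, 3, requires_grad=True)  # learnable conv weight

y = torch.conv2d(x, w)
print(y.requires_grad)  # True

z = torch.relu(x)
print(z.requires_grad)  # False
```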

The documentation for these operators does not seem to indicate which behavior to expect.

Is there documentation listing which operators behave this way? And is there some way to control it, other than manually disabling requires_grad on the output?

Thanks.