Why do some of the functions in nn.functional return a Variable and some do not?

I am using some of the functions provided by torch.nn.functional. The input to the function is a Tensor, and I expect the output to also be a Tensor. However, it seems that some of the functions output a torch.autograd.Variable, which seems strange.

For example, if the input x is a Tensor, nn.functional.relu(x) returns a Variable, while a function like nn.functional.normalize(x) returns a Tensor.

Why is there a distinction? What is the rationale behind this?


nn.functional actually expects Variables as inputs.
I think the behavior when you pass plain Tensors is undefined: the result will be numerically correct, but the type of the output will depend on how each particular function happens to be implemented.
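To get a consistent output type, a minimal sketch (assuming a pre-0.4 PyTorch API, where Tensor and Variable are still separate types) is to wrap the input Tensor in a Variable yourself before calling into nn.functional — then every function in the module sees the input type it expects. Note that in PyTorch 0.4 and later, Variable and Tensor were merged, so Variable(x) simply returns a Tensor and the distinction disappears.

```python
import torch
from torch.autograd import Variable
import torch.nn.functional as F

# Plain Tensor input.
x = torch.Tensor([-1.0, 0.0, 2.0])

# Wrap it in a Variable so both functions receive the type
# nn.functional expects, and return the same kind of object.
v = Variable(x)

relu_out = F.relu(v)              # negatives clamped to zero
norm_out = F.normalize(v, dim=0)  # L2-normalized along dim 0
```

With this wrapping, both relu_out and norm_out are the same type, instead of one depending on implementation details.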

Thanks for the clarification.