# How to understand "at::"?

Here, if `input.is_mkldnn()` is true, the code calls `at::mkldnn_adaptive_avg_pool2d`, while the other branches call `_adaptive_avg_pool2d`. Why does the mkldnn branch need the `at::` prefix? What is the difference between them?

```cpp
Tensor adaptive_avg_pool2d(at::Tensor const& input, IntArrayRef output_size) {
  if (input.is_mkldnn()) {
    return at::mkldnn_adaptive_avg_pool2d(input, output_size);
  }

  // TODO: fastpath for Channels_last should be explored later;
  if (input.suggest_memory_format() == at::MemoryFormat::Contiguous &&
      !input.is_quantized() && output_size[0] == 1 && output_size[1] == 1) {
    // in this case, adaptive pooling is just computing mean over hw
    // dimensions, which can be done more efficiently
    int64_t mean_size = input.size(-1) * input.size(-2);
    Tensor out = input.contiguous().view({-1, mean_size}).mean(-1);
    return input.dim() == 3 ? out.view({input.size(0), 1, 1})
                            : out.view({input.size(0), input.size(1), 1, 1});
  } else {
    return _adaptive_avg_pool2d(input, output_size);
  }
}
```

`at::` is a namespace in libtorch.
I think its qualification might not even be needed on that particular line.

If you would like to follow the calls, these functions will dispatch to the ones mentioned above.

When I create this function:

```cpp
namespace at {
namespace native {
// ...
Tensor conv2d_my(
    const Tensor& input,
    const Tensor& weight,
    const Tensor& bias,
    IntArrayRef stride,
    IntArrayRef padding,
    IntArrayRef dilation,
    int64_t groups) {
  return convolution(input, weight, bias, stride, padding, dilation,
                     false, {{0, 0}}, groups);
}
}  // namespace native
}  // namespace at
```

When I return `convolution`, it reports:

```
error: call of overloaded ‘convolution(const at::Tensor&, const at::Tensor&, const at::Tensor&, c10::IntArrayRef&, c10::IntArrayRef&, c10::IntArrayRef&, bool, <brace-enclosed initializer list>, int64_t&)’ is ambiguous
```

If I add `at::`, it compiles.

Then you would need to specify the namespace in your code snippet.
I don’t know if `mkldnn_adaptive_avg_pool2d` is ambiguous.

They are all in `at::native::`.