Meaning of torch::autograd:: prefix in autograd profiler names

Hi,

I have a question about the autograd profiler output. I'm trying to link each row of the log to the operations in my Python code, and I'm wondering what the operations tagged with torch::autograd:: mean.
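For context, this is roughly how I'm collecting the trace (a minimal sketch; the model, shapes, and loss here are just placeholders for my actual code):

```python
import torch

# Placeholder model and input (my real model is different)
model = torch.nn.Linear(16, 16)
x = torch.randn(8, 16, requires_grad=True)

# Record a forward + backward pass with the autograd profiler
with torch.autograd.profiler.profile() as prof:
    out = model(x)
    loss = torch.nn.functional.mse_loss(out, torch.zeros_like(out))
    loss.backward()

# The rows below come from a table like this one
print(prof.key_averages().table(sort_by="cpu_time_total"))
```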

Example:

MseLossBackward
SqueezeBackward0
torch::autograd::CopyBackwards
ViewBackward
AddBackward0
UnsafeViewBackward
BmmBackward0
ExpandBackward
CloneBackward
InverseBackward
MmBackward
TransposeBackward0
SubBackward0
UnsqueezeBackward0
SelectBackward
SliceBackward
MulBackward0
ReluBackward0
NativeBatchNormBackward
SqueezeBackward1

Thanks!