Can I somehow attribute memory allocations recorded with the pytorch profiler with the associated kernels?
When using the profiler, I can see the allocations attributed to individual operators (such as aten::empty), but I do not know how to link them to individual kernels. A view like the trace timeline, but showing memory allocations instead of runtime, would be helpful.
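For context, this is roughly how I am recording the allocations (a minimal sketch with `profile_memory=True`; the workload is just a placeholder):

```python
import torch
from torch.profiler import profile, ProfilerActivity

def run():
    # Placeholder workload: the matmul allocates its output via aten::empty.
    x = torch.randn(1024, 1024)
    return x @ x

with profile(
    activities=[ProfilerActivity.CPU],
    profile_memory=True,
    record_shapes=True,
) as prof:
    run()

# Per-operator memory stats: aten::empty shows up with its allocated bytes,
# but there is no direct link back to the kernels that use those buffers.
print(prof.key_averages().table(sort_by="self_cpu_memory_usage", row_limit=10))
```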
I see that there is a lot of development on the profiler in PyTorch's main branch (torch.profiler — PyTorch main documentation), in particular a function that seems to return a JSON with two arrays, but I have not been able to use it to locate the memory allocations as I would like.
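What I have tried so far is exporting the Chrome trace and digging through the JSON myself. With `profile_memory=True`, allocations appear there as "[memory]" instant events, but correlating them with the surrounding ops seems to require manual timestamp matching (a sketch, not a confirmed approach):

```python
import json
import torch
from torch.profiler import profile, ProfilerActivity

with profile(activities=[ProfilerActivity.CPU], profile_memory=True) as prof:
    x = torch.randn(512, 512)
    y = x @ x

prof.export_chrome_trace("trace.json")

with open("trace.json") as f:
    trace = json.load(f)

# Allocations/frees show up as "[memory]" events with a timestamp ("ts")
# and byte count, but with no explicit reference to an op or kernel.
mem_events = [e for e in trace["traceEvents"] if e.get("name") == "[memory]"]
print(f"{len(mem_events)} memory events recorded")
```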