Tensors don’t know their names, and they might not all have names.
If your tensors require gradients, one thing you can do - at the expense of speed - is enable anomaly mode, which records where each node of the graph was instantiated:
```python
import torch

with torch.autograd.detect_anomaly():
    a = torch.randn(5, 5, requires_grad=True)
    b = a * 2 + 1
```

Then

```python
b.grad_fn.metadata['traceback_'][-1]
```

contains

```
' File "<ipython-...>", line 3, in <module>\n    b = a * 2 + 1\n'
```
This only works for tensors that are part of the autograd graph.
Another approach, with better coverage, is to use the `__torch_function__` hook to record the source line where each tensor is created.
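A minimal sketch of that idea, using a `torch.Tensor` subclass (the names `TracedTensor` and `_created_at` are illustrative, not PyTorch APIs):

```python
import traceback

import torch

class TracedTensor(torch.Tensor):
    """Tensor subclass that remembers the source line that produced it."""

    @classmethod
    def __torch_function__(cls, func, types, args=(), kwargs=None):
        result = super().__torch_function__(func, types, args, kwargs or {})
        if isinstance(result, torch.Tensor):
            # extract_stack()[-1] is this frame; [-2] is the user code
            # that triggered the torch operation.
            result._created_at = traceback.extract_stack()[-2]
        return result

a = torch.randn(5, 5, requires_grad=True).as_subclass(TracedTensor)
b = a * 2 + 1  # dispatches through __torch_function__

site = b._created_at  # a traceback.FrameSummary
print(site.filename, site.lineno, site.line)
```

Unlike anomaly mode, this catches tensors whether or not they require gradients, but only for operations that go through the subclass, so the inputs have to be wrapped up front.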