Tensor's reference vs name

Hi all,
take a simple example:

import torch
import torch.nn as nn

class MyModule(nn.Module):

    def __init__(self):
        super(MyModule, self).__init__()
        # register a parameter on the module under the name 't'
        self.register_parameter('t', nn.Parameter(torch.tensor([3.])))


net = MyModule().cuda()

print(net.t, net.t.name, net.t.device)

and the output is

tensor([3.], device='cuda:0', requires_grad=True) None cuda:0

I wonder whether the name argument of register_parameter is really just a reference to the tensor. And if so, what does the tensor's name property actually mean?

Hello :smiley:
I believe the name is simply what you want the attribute to be called on the module, which is why you can then do print(net.t). So instead of writing self.t = nn.Parameter(…) you can register it the way you did. I'm not sure there is any practical difference.
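A minimal sketch of what I mean (the module name TwoWays and the second parameter name 'u' are just made up for illustration):

import torch
import torch.nn as nn

class TwoWays(nn.Module):
    def __init__(self):
        super().__init__()
        # plain attribute assignment -- nn.Module registers the parameter under the name 't'
        self.t = nn.Parameter(torch.tensor([3.]))
        # explicit registration -- the string is the attribute/parameter name
        self.register_parameter('u', nn.Parameter(torch.tensor([4.])))

net = TwoWays()
# both appear in named_parameters() under the names they were registered with
print(dict(net.named_parameters()).keys())  # dict_keys(['t', 'u'])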

Or is that what you are asking?