Difference between torch.Tensor and torch.empty?

Can you clarify what “uninitialized” means?

Perhaps I am being pedantic, but I see the implementation returns small numbers rather than zeros. Are the semantics of “uninitialized” intended to be zero?

I guess what I am really after is why it’s not exactly zero.

I printed some outputs, and they were more confusing than clarifying:

>>> torch.empty(3, dtype=torch.long)
tensor([     140187750138624,      140187750063648, -5764607523034234880])
>>> torch.empty(3)
tensor([0.0000e+00, 1.4013e-45, 0.0000e+00])
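
For what it's worth, the tiny `1.4013e-45` looks like a leftover bit pattern rather than a deliberate number: reinterpreting the integer `1` as a float32 gives exactly that subnormal value. A minimal sketch using the standard-library `struct` module (this is my own check, not anything from the PyTorch docs):

```python
import struct

# Pack the 32-bit integer 1 into 4 little-endian bytes,
# then reinterpret those same bytes as a 32-bit float.
bits = struct.pack('<i', 1)
value = struct.unpack('<f', bits)[0]
print(value)  # ~1.4013e-45, the smallest positive subnormal float32
```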

Questions:

  1. The second one seems to be trying to be zero. Why can’t it be exactly zero? I thought zero can be represented exactly in float/double etc.
  2. Why is the first one nothing resembling zero? What is the intended semantics there?
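
To frame what I expected, here is the comparison I had in mind (a sketch assuming a standard PyTorch install): `torch.zeros` guarantees every element is exactly zero, whereas `torch.empty` apparently just allocates memory without writing to it.

```python
import torch

# zeros() writes 0.0 into every element -- guaranteed exact.
z = torch.zeros(3)
print(z)

# empty() only allocates; the element values are whatever bytes
# happened to be in that memory, so they vary from run to run.
e = torch.empty(3)
print(e)
```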