Long underflow/overflow?

What could be going wrong here?

Converting a scalar to a LongTensor returns an arbitrarily large value.

[screenshot: output showing the arbitrarily large value]

This is the source.

[screenshot: the source code]

A bug?

Changing the source to id = torch.LongTensor([bos_id]) works fine.
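
For context, here is a minimal snippet reproducing the two variants (bos_id is just a placeholder int here; in my code it comes from the tokenizer):

import torch

bos_id = 1  # placeholder value for illustration

# What my original source did -- returns an arbitrarily large value
id = torch.LongTensor(bos_id)
print(id)   # e.g. tensor([94443093879952]), changes on every run

# The change that works fine
id = torch.LongTensor([bos_id])
print(id)   # tensor([1])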

You are creating a LongTensor of size 1 using uninitialized memory.
Using the legacy “tensor type” constructors is not recommended, as it leads to this kind of unexpected behavior (the same applies to e.g. torch.FloatTensor).
E.g. if you pass a scalar value of 10, you get a tensor with 10 uninitialized values:

torch.LongTensor(10)
tensor([ 94443093879952,  94439303731984, 140676049261888,               0,
                      0,               0,               0,               0,
                      0,               1])

Use torch.tensor(bos_id) instead (which will derive the dtype automatically) or specify the dtype explicitly via torch.tensor(bos_id, dtype=torch.long).
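
A quick sketch of the recommended variants (bos_id is an assumed placeholder value here):

import torch

bos_id = 1  # assumed placeholder value

torch.tensor(bos_id)                    # 0-dim tensor, dtype inferred as int64
torch.tensor(bos_id, dtype=torch.long)  # dtype specified explicitly
torch.tensor([bos_id])                  # 1-element tensor, if you need that shape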


I see. Indeed, the docs don’t use those constructors explicitly. torch.tensor seems the way to go. Thanks!