torch.LongTensor doesn't initialize to 0 above certain size

Creating a LongTensor with size < 74240 yields a tensor filled with random long-integer noise, while sizes above 74240 yield a tensor filled with uniform zeros. Any idea why this is?

Python 3.6.5 | packaged by conda-forge | (default, Apr  6 2018, 13:44:09)
Type 'copyright', 'credits' or 'license' for more information
IPython 6.4.0 -- An enhanced Interactive Python. Type '?' for help.

In [1]: import torch

In [2]: max(torch.LongTensor(74240))

In [3]: max(torch.LongTensor(74241))


Creating a Tensor does not initialize its memory; it will contain whatever the memory held when the allocator returned it.
In your case, I would guess there is an allocator for small objects that doesn't initialize memory and one for large objects that does zero out all the values.
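A minimal sketch of the distinction, assuming a current PyTorch where `torch.empty` and `torch.zeros` are available: the legacy `torch.LongTensor(n)` constructor (like `torch.empty`) returns uninitialized memory, so any zeros you observe are an accident of the allocator, not a guarantee. If you need zeros, ask for them explicitly:

```python
import torch

# Uninitialized allocation: contents are arbitrary, regardless of size.
t = torch.empty(74241, dtype=torch.long)

# Explicit initialization: guaranteed zeros at any size.
z = torch.zeros(74241, dtype=torch.long)
assert (z == 0).all()
```

This way the behavior doesn't depend on which internal allocator (small-object vs. large-object) happens to serve the request.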

It seems like PyTorch 0.4.0 doesn't have this issue.