Tensor slicing: out-of-range indices for the last dimension

I stumbled across something odd while playing around with tensor slicing.

import torch

a = torch.arange(20).view(4, 5)
# tensor([[  0.,   1.,   2.,   3.,   4.],
#         [  5.,   6.,   7.,   8.,   9.],
#         [ 10.,  11.,  12.,  13.,  14.],
#         [ 15.,  16.,  17.,  18.,  19.]])

If I use out-of-range or empty slices on the first dimension (e.g., a[5:, :] or a[4:4, :]), an error pops up, as expected:

# Traceback (most recent call last):
#   File "<stdin>", line 1, in <module>
# RuntimeError: dimension out of range (expected to be in range of [-1, 0], but got 1)

But when I use them for the second dimension, this happens:

a[:, 5:]
# tensor([  5.,  10.,  15.,   0.])
# The first column, cyclically shifted by one?

a[:, 4:4]
# tensor([  4.,   9.,  14.,  19.])
# Squeezed final column

Is this intentional, or does it serve some purpose? Thanks in advance!

I think this is a bug in the bounds checking: a[:, 5:] looks like it reads the (unchecked) element at offset 5 of each row, i.e. the first element of the next row, with the last read running past the end of the storage, and a[:, 4:4] should be empty rather than the last column.
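
For comparison, NumPy clamps out-of-range and empty slice bounds on every dimension and just returns an empty array, which is what I'd expect a fixed version to do here as well (a quick sketch of the NumPy convention, not a claim about how the fix will actually behave):

import numpy as np

b = np.arange(20).reshape(4, 5)

b[:, 5:].shape   # (4, 0) -- start past the end is clamped, result is empty
b[:, 4:4].shape  # (4, 0) -- empty range, empty result
b[5:, :].shape   # (0, 5) -- same clamping on the first dimension, no error is raised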