Today I received a simple binary-classification dataset. After training some models, I tried to convert the labels to one-hot encoding, and that is when I realized that some labels were wrong.
For instance, the numpy.array contains the following values (dtype int64):
```
-9223372036854775808   (obviously a wrong label)
0
1
```
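One thing I noticed: the bad value is exactly the minimum representable int64. As far as I know, that is also the sentinel NumPy produces on most platforms when a NaN is cast to an integer dtype, so my guess (unconfirmed) is that a NaN slipped in during preprocessing:

```python
import warnings
import numpy as np

# The "wrong" label is exactly the smallest value an int64 can hold.
print(np.iinfo(np.int64).min)  # -9223372036854775808

# On most platforms, casting NaN to an integer dtype yields that same
# sentinel value (the cast is undefined behaviour, so NumPy may warn).
with warnings.catch_warnings():
    warnings.simplefilter("ignore", RuntimeWarning)
    bad = np.array([np.nan]).astype(np.int64)
print(bad[0])
```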
When converted to a torch tensor using torch.from_numpy, this prints as:
```
-9.2234e+18   (wrong label)
 0
 1
```
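For what it's worth, torch.from_numpy itself preserves the int64 dtype and the exact value; the scientific-notation display above suggests the tensor was cast to float at some point (a float cannot represent this value exactly). A small check of that assumption:

```python
import numpy as np
import torch

labels = np.array([-9223372036854775808, 0, 1], dtype=np.int64)
t = torch.from_numpy(labels)

# from_numpy keeps the integer dtype and the exact value...
print(t.dtype)      # torch.int64
print(t[0].item())  # -9223372036854775808

# ...but casting to float makes it print in scientific notation,
# matching the output shown above.
print(t.float())
```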
Even stranger, I can minimize the categorical cross entropy, which normally throws an error if the labels provided are not in [0, C-1].
I did some checks. For instance, if I set the labels of the two classes to, say, -1 and -2, I get the error:

```
Assertion `t >= 0 && t < n_classes` failed.
```
If I set one label to 1 and the other to -1, it also throws that error. However, if I set one label to 1 and the other to -9223372036854775808, everything works perfectly.
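In the meantime, I am guarding against this with an explicit sanity check on the labels before training (plain NumPy, nothing PyTorch-specific; the helper name is mine):

```python
import numpy as np

def check_binary_labels(labels: np.ndarray) -> None:
    """Raise if any label is outside {0, 1}."""
    bad_mask = ~np.isin(labels, [0, 1])
    if bad_mask.any():
        raise ValueError(f"invalid labels found: {np.unique(labels[bad_mask])}")

# A clean label array passes silently.
check_binary_labels(np.array([0, 1, 1, 0], dtype=np.int64))

# The corrupted array is rejected instead of silently training.
try:
    check_binary_labels(np.array([0, 1, -9223372036854775808], dtype=np.int64))
except ValueError as e:
    print(e)
```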
What is happening?