An error in torchtext's utils.py code

Today, using Python 3.7, PyTorch 1.2, and torchtext 0.4.0, I hit this error:
OverflowError: Python int too large to convert to C long
The cause is the maxInt = sys.maxsize logic; in utils.py the code looks like this:
maxInt = sys.maxsize
while True:
    # decrease the maxInt value by a factor of 10
    # as long as the OverflowError occurs
    try:
        csv.field_size_limit(maxInt)
        break
    except OverflowError:
        maxInt = int(maxInt / 10)

csv.field_size_limit(sys.maxsize)

I corrected the code as follows:
maxInt = sys.maxsize
while True:
    # decrease the maxInt value by a factor of 10
    # as long as the OverflowError occurs
    try:
        csv.field_size_limit(maxInt)
        break
    except OverflowError:
        maxInt = int(maxInt / 10)

# csv.field_size_limit(sys.maxsize)
csv.field_size_limit(maxInt)

Then the error disappears!
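To see why raising the limit matters, here is a minimal, self-contained sketch: the csv module's default field-size limit is 131072 characters, so reading a larger field raises csv.Error unless the limit is raised first. The back-off loop is needed because on platforms where C long is 32 bits (e.g. 64-bit Windows), passing sys.maxsize directly raises the OverflowError above.

```python
import csv
import io
import sys

# Raise the csv field-size limit, backing off by a factor of 10 if the
# platform's C long cannot hold sys.maxsize (e.g. 64-bit Windows).
maxInt = sys.maxsize
while True:
    try:
        csv.field_size_limit(maxInt)
        break
    except OverflowError:
        maxInt = int(maxInt / 10)

# With the raised limit, a field larger than the default 131072-character
# cap can now be read without csv.Error.
big_field = "x" * 200_000
row = next(csv.reader(io.StringIO(big_field + ",ok")))
print(len(row[0]), row[1])  # 200000 ok
```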

Is this error related to PyTorch, or to some other package used to read CSV files?

This error is related to the torchtext package reading CSV files.
The function:
train, valid, test = data.TabularDataset.splits(path='…/CSVdata',
                                                train='tr_cut_csv.csv',
                                                validation='va_cvs.csv',
                                                test='te_cut_csv.csv',
                                                format='csv',
                                                skip_header=True,
                                                fields=[('text', TEXT), ('label', LABEL)])
When I debug this function, it jumps into the function
def unicode_csv_reader(unicode_csv_data, **kwargs), which is in torchtext's utils.py.
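For reference, that helper is essentially the following pattern (a simplified sketch, not the verbatim library source): it raises the csv field-size limit and then delegates to the stdlib csv.reader.

```python
import csv
import io
import sys

def unicode_csv_reader(unicode_csv_data, **kwargs):
    # Sketch of torchtext's helper: raise the csv field-size limit,
    # backing off when the value overflows the platform's C long,
    # then delegate row parsing to the stdlib csv module.
    maxInt = sys.maxsize
    while True:
        try:
            csv.field_size_limit(maxInt)
            break
        except OverflowError:
            maxInt = int(maxInt / 10)
    for line in csv.reader(unicode_csv_data, **kwargs):
        yield line

rows = list(unicode_csv_reader(io.StringIO("text,label\nhello,1")))
print(rows)  # [['text', 'label'], ['hello', '1']]
```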
Above all, this error occurs on my machine, and I think the logic in this code is flawed:
maxInt = sys.maxsize
while True:
    # decrease the maxInt value by a factor of 10
    # as long as the OverflowError occurs
    try:
        csv.field_size_limit(maxInt)
        break
    except OverflowError:
        maxInt = int(maxInt / 10)

csv.field_size_limit(sys.maxsize)

It would be better changed to:
maxInt = sys.maxsize
while True:
    # decrease the maxInt value by a factor of 10
    # as long as the OverflowError occurs
    try:
        csv.field_size_limit(maxInt)
        break
    except OverflowError:
        maxInt = int(maxInt / 10)

csv.field_size_limit(maxInt)
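One way to confirm the corrected logic behaves sensibly: csv.field_size_limit() called with no argument returns the limit currently in effect, so after the loop it should equal the value the loop settled on, rather than being re-set with a sys.maxsize that may overflow again.

```python
import csv
import sys

# The corrected logic: settle on the largest limit the platform accepts.
maxInt = sys.maxsize
while True:
    try:
        csv.field_size_limit(maxInt)
        break
    except OverflowError:
        maxInt = int(maxInt / 10)
csv.field_size_limit(maxInt)

# Calling field_size_limit() with no argument reports the current limit
# without changing it, so we can verify the loop's value is in effect.
print(csv.field_size_limit() == maxInt)  # True
```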

Also, this error is not about PyTorch. Sorry!