I am loading a large serialized file (about 4.2 GB) with `torch.load` and hit the error below. With a smaller dataset it works fine.
Any hints about this? Thank you.
```
Traceback (most recent call last):
  File "train.py", line 509, in <module>
    main()
  File "train.py", line 409, in main
    dataset = torch.load(opt.data)
  File "Y:\user\v\pytorch\Anaconda3\lib\site-packages\torch\serialization.py", line 229, in load
    return _load(f, map_location, pickle_module)
  File "Y:\user\v\pytorch\Anaconda3\lib\site-packages\torch\serialization.py", line 384, in _load
    deserialized_objects[key]._set_from_file(f, offset)
RuntimeError: storage has wrong size: expected 77201408 got 13
```
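This error typically means the file on disk is shorter or different from what `torch.save` originally wrote, e.g. an interrupted save or an incomplete copy over the network. Before digging into PyTorch itself, it can help to compare a checksum of the file on the machine that saved it and the machine that loads it. Here is a minimal, standard-library-only sketch of that check (the `checkpoint.pkl` file and truncation step are just stand-ins for illustration):

```python
import hashlib
import os
import pickle
import tempfile

def sha256_of(path, chunk=1 << 20):
    """Hash the file in chunks so multi-GB checkpoints need not fit in RAM."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while True:
            block = f.read(chunk)
            if not block:
                break
            h.update(block)
    return h.hexdigest()

# Demo with a small stand-in for a real checkpoint file.
with tempfile.TemporaryDirectory() as d:
    path = os.path.join(d, "checkpoint.pkl")
    with open(path, "wb") as f:
        pickle.dump({"weights": list(range(10))}, f)
    good = sha256_of(path)

    # Simulate a truncated transfer: drop the last few bytes.
    with open(path, "r+b") as f:
        f.truncate(os.path.getsize(path) - 4)
    bad = sha256_of(path)

    print(good != bad)  # True: a truncated file no longer matches
```

If the hashes (or even just the byte sizes from `os.path.getsize`) differ between the two machines, re-copying or re-saving the file is the fix, not anything in the loading code.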
I'm running into a similar issue. Have you solved your problem?
```
loading pretrained model models/resnet-152-kinetics.pth
Traceback (most recent call last):
  File "main.py", line 40, in <module>
    model, parameters = generate_model(opt)
  File "/home/zhaoyan/self_3D-ResNets-PyTorch_8/model.py", line 106, in generate_model
    pretrain = torch.load(opt.pretrain_path)
  File "/usr/local/lib/python2.7/dist-packages/torch/serialization.py", line 261, in load
    return _load(f, map_location, pickle_module)
  File "/usr/local/lib/python2.7/dist-packages/torch/serialization.py", line 416, in _load
    deserialized_objects[key]._set_from_file(f, offset)
RuntimeError: storage has wrong size: expected -5033425001016997258 got 512
```