Hi,
I've googled this problem, but most people seem to hit it in the loss-calculation step. Mine happens while the DataLoader is building a batch, and I don't know what went wrong.
Traceback (most recent call last):
  File "dtd_rnn_training.py", line 469, in <module>
    training(args)
  File "dtd_rnn_training.py", line 212, in training
    for batch_idx, batch in enumerate(train):
  File "/home/anaconda3/envs/aa/lib/python3.8/site-packages/torch/utils/data/dataloader.py", line 517, in __next__
    data = self._next_data()
  File "/home/anaconda3/envs/aa/lib/python3.8/site-packages/torch/utils/data/dataloader.py", line 1199, in _next_data
    return self._process_data(data)
  File "/home/anaconda3/envs/aa/lib/python3.8/site-packages/torch/utils/data/dataloader.py", line 1225, in _process_data
    data.reraise()
  File "/home/anaconda3/envs/aa/lib/python3.8/site-packages/torch/_utils.py", line 429, in reraise
    raise self.exc_type(msg)
RuntimeError: Caught RuntimeError in DataLoader worker process 0.
Original Traceback (most recent call last):
  File "/home/anaconda3/envs/aa/lib/python3.8/site-packages/torch/utils/data/_utils/worker.py", line 202, in _worker_loop
    data = fetcher.fetch(index)
  File "/home/anaconda3/envs/aa/lib/python3.8/site-packages/torch/utils/data/_utils/fetch.py", line 47, in fetch
    return self.collate_fn(data)
  File "/home/anaconda3/envs/aa/lib/python3.8/site-packages/torch/utils/data/_utils/collate.py", line 73, in default_collate
    return {key: default_collate([d[key] for d in batch]) for key in elem}
  File "/home/anaconda3/envs/aa/lib/python3.8/site-packages/torch/utils/data/_utils/collate.py", line 73, in <dictcomp>
    return {key: default_collate([d[key] for d in batch]) for key in elem}
  File "/home/anaconda3/envs/aa/lib/python3.8/site-packages/torch/utils/data/_utils/collate.py", line 55, in default_collate
    return torch.stack(batch, 0, out=out)
RuntimeError: result type Double can't be cast to the desired output type Long
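From the traceback, `default_collate` fails while calling `torch.stack` on one key of the dict your dataset returns, which usually means the same key yields tensors of different dtypes across samples (e.g. `int64`/Long for one sample and `float64`/Double for another). Since I don't have your dataset code, here is a minimal sketch with a hypothetical `MixedDtypeDataset` showing the mismatch, and a `FixedDataset` that casts the key to one explicit dtype in `__getitem__`:

```python
import torch
from torch.utils.data import Dataset, DataLoader

# Hypothetical dataset reproducing the mismatch: the same key ("label")
# yields a Long tensor for one sample and a Double tensor for another,
# so default_collate cannot torch.stack() them in a worker process.
class MixedDtypeDataset(Dataset):
    def __len__(self):
        return 2

    def __getitem__(self, idx):
        if idx == 0:
            label = torch.tensor(1)                          # int64 (Long)
        else:
            label = torch.tensor(1.0, dtype=torch.float64)   # float64 (Double)
        return {"x": torch.zeros(3), "label": label}

# Fix: force every sample to one explicit dtype before collation.
class FixedDataset(MixedDtypeDataset):
    def __getitem__(self, idx):
        sample = super().__getitem__(idx)
        sample["label"] = sample["label"].long()  # consistent dtype per key
        return sample

batch = next(iter(DataLoader(FixedDataset(), batch_size=2)))
print(batch["label"].dtype)  # torch.int64
```

A quick way to find the offending key in your own dataset is to print `{k: v.dtype for k, v in sample.items()}` for a few samples and look for a key whose dtype varies.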