Cryptic error about out= parameter

I’m getting an error like the one in “Torch.stack cryptic error when using out= parameter”, but the difference is that I don’t use out= anywhere in my code. All I have is a DataLoader:

trainloader = DataLoader(dset, batch_size=bs, num_workers=6)
for num, train_dict in enumerate(trainloader):
    break
---------------------------------------------------------------------------
RuntimeError                              Traceback (most recent call last)
<ipython-input-92-03b8fd8e9dd0> in <module>
----> 1 for num, train_dict in enumerate(trainloader):
      2     break
/scratch1/eh22/conda3/envs/py31/lib/python3.6/site-packages/torch/utils/data/dataloader.py in __next__(self)
    817             else:
    818                 del self.task_info[idx]
--> 819                 return self._process_data(data)
    820 
    821     next = __next__  # Python 2 compatibility
/scratch1/eh22/conda3/envs/py31/lib/python3.6/site-packages/torch/utils/data/dataloader.py in _process_data(self, data)
    844         self._try_put_index()
    845         if isinstance(data, ExceptionWrapper):
--> 846             data.reraise()
    847         return data
    848 
/scratch1/eh22/conda3/envs/py31/lib/python3.6/site-packages/torch/_utils.py in reraise(self)
    367             # (https://bugs.python.org/issue2651), so we work around it.
    368             msg = KeyErrorMessage(msg)
--> 369         raise self.exc_type(msg)
RuntimeError: Caught RuntimeError in DataLoader worker process 0.
Original Traceback (most recent call last):
  File "/scratch1/eh22/conda3/envs/py31/lib/python3.6/site-packages/torch/utils/data/_utils/worker.py", line 178, in _worker_loop
    data = fetcher.fetch(index)
  File "/scratch1/eh22/conda3/envs/py31/lib/python3.6/site-packages/torch/utils/data/_utils/fetch.py", line 47, in fetch
    return self.collate_fn(data)
  File "/scratch1/eh22/conda3/envs/py31/lib/python3.6/site-packages/torch/utils/data/_utils/collate.py", line 75, in default_collate
    return {key: default_collate([d[key] for d in batch]) for key in elem}
  File "/scratch1/eh22/conda3/envs/py31/lib/python3.6/site-packages/torch/utils/data/_utils/collate.py", line 75, in <dictcomp>
    return {key: default_collate([d[key] for d in batch]) for key in elem}
  File "/scratch1/eh22/conda3/envs/py31/lib/python3.6/site-packages/torch/utils/data/_utils/collate.py", line 56, in default_collate
    return torch.stack(batch, 0, out=out)
RuntimeError: stack(): functions with out=... arguments don't support automatic differentiation, but one of the arguments requires grad.
  File "/scratch1/eh22/conda3/envs/py31/lib/python3.6/site-packages/torch/utils/data/_utils/collate.py", line 75, in <dictcomp>
    return {key: default_collate([d[key] for d in batch]) for key in elem}
  File "/scratch1/eh22/conda3/envs/py31/lib/python3.6/site-packages/torch/utils/data/_utils/collate.py", line 56, in default_collate
    return torch.stack(batch, 0, out=out)
RuntimeError: stack(): functions with out=... arguments don't support automatic differentiation, but one of the arguments requires grad.

You are returning tensors that require grad from your dataset’s __getitem__. With num_workers > 0, the default collate function stacks the samples of each batch directly into a shared-memory buffer via torch.stack(batch, 0, out=out), and out= variants don’t support autograd, so the error is raised even though your own code never passes out=. Detach the tensors before returning them from the dataset.
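
A minimal sketch of the fix, assuming the samples are built from a tensor that was created with requires_grad=True (MyDataset and the "input" key are hypothetical names for illustration):

import torch
from torch.utils.data import Dataset, DataLoader

class MyDataset(Dataset):
    """Hypothetical dataset whose samples carry grad history."""
    def __init__(self):
        # Anything derived from a requires_grad tensor keeps requires_grad=True.
        self.features = torch.randn(100, 3, requires_grad=True)

    def __len__(self):
        return len(self.features)

    def __getitem__(self, idx):
        x = self.features[idx]
        # x.requires_grad is True here; in a worker process, default_collate
        # would call torch.stack(batch, 0, out=out) and raise the RuntimeError.
        # Detaching before returning fixes the batching:
        return {"input": x.detach()}

dset = MyDataset()
trainloader = DataLoader(dset, batch_size=4, num_workers=2)
for num, train_dict in enumerate(trainloader):
    break  # iterates cleanly now

If you actually need gradients to flow through these tensors, detach in the dataset and call requires_grad_() on the batched tensor inside the training loop, or keep the differentiable computation out of the DataLoader workers entirely.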