I mean convert an argument like {"key": torch.tensor([1., 2.])}
to {"key": [1., 2.]}.
Just make sure there is no tensor in the arguments you pass into forward() that needs to be copied.
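A minimal sketch of what that conversion could look like (the `args` dict here is just an illustrative example): values that are tensors are turned into plain Python lists, so DataParallel replicates them as-is instead of trying to scatter them.

```python
import torch

# Hypothetical kwargs for forward(); only "key" is a tensor.
args = {"key": torch.tensor([1., 2.]), "n": 3}

# Convert tensor values to plain lists; leave everything else untouched.
plain = {k: v.tolist() if torch.is_tensor(v) else v for k, v in args.items()}
print(plain)  # {'key': [1.0, 2.0], 'n': 3}
```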
My input argument types for forward
look like tensor, tensor, int
… but I still cannot get DataParallel
to work….
Can you try not passing the int
into forward,
then test whether DataParallel works?
Yep, thanks a lot.
I have solved my problem; it's true that I was passing non-tensor data into the model's forward computation. As you stated, "All tensors will be scattered on dim specified (default 0)", so it's best to format the forward tensor data with the first dimension representing the sampling batch.
Have a good day!
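To sum up the fix, here is a small sketch (with a hypothetical `Net` module) of a forward that takes only batch-first tensors, so DataParallel can split each argument along dim 0. The `.cuda()` branch is guarded so the snippet also runs on a CPU-only machine, where DataParallel simply calls the wrapped module directly.

```python
import torch
import torch.nn as nn

class Net(nn.Module):
    """Hypothetical model whose forward takes only tensors."""
    def __init__(self, in_features=4, out_features=2):
        super().__init__()
        self.fc = nn.Linear(in_features, out_features)

    def forward(self, x, mask):
        # x, mask: (batch, in_features) — both batch-first tensors,
        # so DataParallel can scatter each one on dim 0.
        return self.fc(x * mask)

net = Net()
if torch.cuda.is_available():
    net = nn.DataParallel(net.cuda())

x = torch.randn(8, 4)      # first dim = batch
mask = torch.ones(8, 4)
if torch.cuda.is_available():
    x, mask = x.cuda(), mask.cuda()

out = net(x, mask)
print(out.shape)  # torch.Size([8, 2])
```

The int that previously went into forward would instead be baked into the module (e.g. as a constructor argument) or wrapped in a tensor.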