When:
- using torch.nn.EmbeddingBag (doc)
- without offsets but with a 2D (B x N) batch
- with CUDA
- on a GPU other than 0

we get an "arguments are located on different GPUs" error.
```python
import torch
from torch.autograd import Variable

e = torch.nn.EmbeddingBag(10, 3)
a = Variable(torch.LongTensor([[1, 2, 3], [4, 5, 6]]))  # 2D (B x N) batch, no offsets
e(a)  # OK on CPU

e.cuda(0)
a = a.cuda(0)
e(a)  # OK on GPU 0

e.cuda(1)
a = a.cuda(1)
e(a)  # ERROR: arguments are located on different GPUs

torch.cuda.set_device(1)
e(a)  # OK once the current device is set to 1
```
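For comparison, the same two bags can be written in the 1D-with-offsets form that the conditions above contrast with. This sketch is mine, not part of the original repro, and I have not checked whether that path hits the same device mismatch on GPU 1:

```python
import torch
from torch.autograd import Variable

e = torch.nn.EmbeddingBag(10, 3)
# 1D form: flatten the indices and mark where each bag starts.
flat = Variable(torch.LongTensor([1, 2, 3, 4, 5, 6]))
offsets = Variable(torch.LongTensor([0, 3]))  # bag 0 = [1,2,3], bag 1 = [4,5,6]
e(flat, offsets)  # same result as e(a) with the 2D (2 x 3) batch
```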
Is that a bug? The trick of calling torch.cuda.set_device(1) works, but in my opinion it shouldn't be necessary.
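For what it's worth, a scoped variant of the same workaround is the torch.cuda.device context manager, which makes GPU 1 the current device only inside the with block. This is just a sketch; it presumably works for the same reason set_device does and does not address the underlying issue:

```python
import torch
from torch.autograd import Variable

e = torch.nn.EmbeddingBag(10, 3).cuda(1)
a = Variable(torch.LongTensor([[1, 2, 3], [4, 5, 6]])).cuda(1)

# Temporarily switch the current device to 1 instead of changing it
# globally with torch.cuda.set_device(1).
with torch.cuda.device(1):
    out = e(a)  # OK inside the context, like after set_device(1)
```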