Hello. I’m running my model on multiple GPUs, but it throws the error “AttributeError: ‘DataParallel’ object has no attribute ‘loss’”. Does DataParallel only replicate the model’s forward() function? Is the loss() function not replicated? What should I do? Thanks a lot.
import torch
import torch.nn as nn
from torch.nn import DataParallel

class MyModule1(nn.Module):
    def __init__(self):
        super(MyModule1, self).__init__()
        self.fc1 = nn.Linear(2, 1)

    def forward(self, x):
        x = self.fc1(x)
        return x

    def loss(self, x):
        loss = ......
        return loss

model1 = MyModule1()
model1 = DataParallel(model1)
model1.to('cuda:0')
x = torch.tensor([10, 5], dtype=torch.float).to('cuda:0')
print(model1.loss(x))
That happens because model1 is now an object of class DataParallel, which indeed has no such attribute: DataParallel only wraps and replicates the underlying module’s forward().
You can call model1.module.loss(x) instead, since .module gives you back the wrapped model. But then the loss computation will run on only one GPU.
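A common workaround is to compute the loss inside forward(), so the loss computation is replicated across GPUs along with the rest of the model. A minimal sketch below, using a placeholder MSE loss (the original loss body was elided) and falling back to CPU when no GPU is available:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MyModule1(nn.Module):
    def __init__(self):
        super(MyModule1, self).__init__()
        self.fc1 = nn.Linear(2, 1)

    def forward(self, x, target=None):
        out = self.fc1(x)
        if target is None:
            return out
        # Loss is computed inside forward(), so DataParallel replicates it:
        # each GPU computes the loss on its own shard of the batch.
        # reduction='none' keeps per-sample losses so they can be gathered
        # and averaged on the main device.
        return F.mse_loss(out, target, reduction='none')

device = 'cuda:0' if torch.cuda.is_available() else 'cpu'
model1 = MyModule1()
if torch.cuda.is_available():
    model1 = nn.DataParallel(model1)  # wrap only when GPUs exist
model1.to(device)

x = torch.randn(4, 2, device=device)
target = torch.randn(4, 1, device=device)
per_sample_loss = model1(x, target)  # runs on all GPUs under DataParallel
loss = per_sample_loss.mean()        # reduce on the main device
loss.backward()
```

This way the forward pass and the loss are both parallelized, and only the final reduction happens on the main device.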