Moving a module to a device

I am trying to fix: “RuntimeError: Expected all tensors to be on the same device, but found at least two devices, cuda:0 and cpu!” from a torchscript module.

Is there a convenient way to move a whole module onto a particular device? I've tried m.to(torch.device('cuda')) and m.cuda().

Here is a minimal (not quite working) example:

import torch
import torch.nn as nn

class TestModule(nn.Module):
    def __init__(self, net):
        super(TestModule, self).__init__()
        self.net = net
        self.a = torch.rand(2)

    def forward(self, x):
        s = self.net(x)
        return self.a * s

m = TestModule(net)

Initializing with a network on the CPU:

m.net[-1].weight.device → device(type='cpu')
m.a.device → device(type='cpu')

Then:
m = m.to(torch.device('cuda'))
m.net[-1].weight.device → device(type='cuda', index=0)
m.a.device → device(type='cpu')

Thanks!

For an nn.Module, the .to(device) method sends all valid members of the module to the device.
The valid members for this operation are other nn.Modules, Parameters, and Buffers.
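
In your example, self.a is a plain tensor attribute, so .to() never touches it; only attributes registered as submodules, parameters, or buffers are moved. A small sketch (the Demo class and its members are just illustrative) showing which attributes get moved:

import torch
import torch.nn as nn

class Demo(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(2, 2)                       # submodule: moved by .to()
        self.scale = nn.Parameter(torch.ones(2))        # parameter: moved by .to()
        self.register_buffer('offset', torch.zeros(2))  # buffer: moved by .to()
        self.plain = torch.rand(2)                      # plain attribute: NOT moved

demo = Demo().to(torch.device('cuda'))
print(demo.fc.weight.device)  # cuda:0
print(demo.scale.device)      # cuda:0
print(demo.offset.device)     # cuda:0
print(demo.plain.device)      # cpu  <- stays behind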

For your code, the following change would be required:

class TestModule(nn.Module):
    def __init__(self, net):
        super(TestModule, self).__init__()
        self.net = net
        self.a = nn.Parameter(torch.rand(2))  # registered as a Parameter, so .to() moves it

    def forward(self, x):
        s = self.net(x)
        return self.a * s

m = TestModule(net)
device = torch.device('cuda')
m.to(device)
# m will now be on "device"
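
If a is not meant to be trained, registering it as a buffer instead of a Parameter is another option: .to(device) moves buffers as well, but they are excluded from m.parameters() and from gradient updates. A sketch of that variant (the TestModuleBuffer name is just for illustration):

class TestModuleBuffer(nn.Module):
    def __init__(self, net):
        super(TestModuleBuffer, self).__init__()
        self.net = net
        # buffers are moved by .to()/.cuda() and saved in the state_dict,
        # but they are not trainable parameters
        self.register_buffer('a', torch.rand(2))

    def forward(self, x):
        s = self.net(x)
        return self.a * s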