Problem with tensor types

The input is n tensors, and I define my network as follows:

generators = []
for i in range(n):
    main = nn.Sequential()
    main.add_module("some_module", some_module)
    generators.append(main)

In the forward pass, each input tensor is multiplied with generators[i] to produce the output. However, when I run the code I get the error “RuntimeError: Input type (CUDAFloatTensor) and weight type (CPUFloatTensor) should be the same”. It seems that “generators” is not added to my model, and thus the weights inside “generators” are not initialized as CUDA tensors. Is there a way to initialize these weights as CUDA tensors? Thanks.
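
For reference, a quick way to see the problem, as a minimal sketch (the MyModel class and the nn.Linear layer here are just placeholders for my real code):

import torch.nn as nn

class MyModel(nn.Module):
    def __init__(self, n):
        super(MyModel, self).__init__()
        self.generators = []                       # plain python list: never registered with the module
        for i in range(n):
            main = nn.Sequential()
            main.add_module("some_module", nn.Linear(8, 8))
            self.generators.append(main)

model = MyModel(3)
print(len(list(model.parameters())))  # prints 0: the generators' weights are invisible,
                                      # so model.cuda() never moves them to the GPU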

There are at least a couple of ways to add a module to your model (a short sketch of both follows the list):

  1. model.name = module automagically adds the module
  2. model.add_module(name, module)
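
A minimal sketch of both (the gen0/gen1 names and the nn.Linear layers are placeholders, purely for illustration):

import torch.nn as nn

model = nn.Module()                            # stand-in for your model
model.gen0 = nn.Linear(8, 8)                   # 1. attribute assignment registers the module
model.add_module("gen1", nn.Linear(8, 8))      # 2. add_module registers it under the given name
print(list(dict(model.named_children()).keys()))  # ['gen0', 'gen1']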

Your code does neither for generators, so generators isn’t added to your model, which, as you say, is the source of your problem. You could apply either of the above suggestions, but might I suggest the following cleaner possibility?

generators = nn.ModuleList() # behaves like a python list, but adds its elements to the model.
for i in range(n):
    main = nn.Sequential()
    main.add_module("some_module", some_module)
    generators.append(main)

N.B. nn.ModuleList doesn’t have a .forward method, so you will need to loop over its contents in your model’s .forward method.
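
For example, here is a minimal sketch of such a forward loop (the nn.Linear layers, the sizes, and the way the inputs are paired with the generators are only illustrative):

import torch
import torch.nn as nn

class MultiGenerator(nn.Module):
    def __init__(self, n):
        super(MultiGenerator, self).__init__()
        self.generators = nn.ModuleList()       # registered, so .cuda() moves its weights
        for i in range(n):
            main = nn.Sequential()
            main.add_module("some_module", nn.Linear(8, 8))
            self.generators.append(main)

    def forward(self, inputs):                  # inputs: a list/tuple of n tensors
        # nn.ModuleList has no .forward, so loop over its contents explicitly
        return [gen(x) for gen, x in zip(self.generators, inputs)]

model = MultiGenerator(3)
if torch.cuda.is_available():
    model = model.cuda()
    outputs = model([torch.randn(4, 8).cuda() for _ in range(3)])

Because generators is an nn.ModuleList, model.cuda() moves every generator’s weights to the GPU, which should resolve the CUDAFloatTensor/CPUFloatTensor mismatch.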

Your example code adds several modules of the same name to main. Does this work as expected?
I would have done main.add_module("some_module_"+str(i), some_module) to avoid any ambiguities.

Besides, if you don’t need to name the submodules, the following one-liner should work just as well.

generators.append(nn.Sequential(*[some_module for i in range(n)]))

Thanks! ModuleList is exactly what I was looking for.