How to use forward on modules in a ModuleList?

Hi,
I have a net composed of layers in a nn::ModuleList. I didn’t succeed to apply the forward method on the modules in the list. How can I use them?

Here is the code to reproduce the issue:

#include <torch/torch.h>

using namespace torch;

struct LinearImpl : nn::Module
{
    LinearImpl(int in_features, int out_features, bool bias):
        linear_layer(nn::LinearOptions(in_features, out_features).bias(bias))
    {
        register_module("linear_layer", linear_layer);
    }

    torch::Tensor forward(torch::Tensor x)
    {
        x = linear_layer(x);
        return x;
    }

    nn::Linear linear_layer;
};
TORCH_MODULE(Linear);

struct NetImpl : nn::Module
{
    NetImpl(int in_features, int out_features, bool bias):
        layers(Linear(in_features, out_features, bias),
               Linear(out_features, out_features, bias))
    {
        register_module("layers", layers);
    }

    torch::Tensor forward(torch::Tensor x)
    {
        for (const auto &module : *layers)
        {
            x = module->forward(x);
        }
        return x;
    }

    nn::ModuleList layers;
};
TORCH_MODULE(Net);

int main()
{
    Net net(32, 64, false);

    torch::Tensor inputs = torch::randn(32);
    torch::Tensor outputs = net(inputs);

    std::cout << outputs << std::endl;

    return 0;
}

And the error I get during compilation:

main.cpp: In member function ‘at::Tensor NetImpl::forward(at::Tensor)’:
main.cpp:36:25: error: ‘using element_type = class torch::nn::Module {aka class torch::nn::Module}’ has no member named ‘forward’
             x = module->forward(x);
                         ^~~~~~~

I am using libtorch version 1.6.0.dev20200415+cpu on linux.

Hi,

I think you should call the module directly, like you do in Python, rather than calling the forward function directly.

No, I tried that as well, but I get this error:

main.cpp: In member function ‘at::Tensor NetImpl::forward(at::Tensor)’:
main.cpp:36:25: error: no match for call to ‘(const std::shared_ptr<torch::nn::Module>) (at::Tensor&)’
             x = module(x);

It only works when I define a Linear layer outside the ModuleList. However, I need the layers to be in a ModuleList for the rest of my program.

Not sure what is happening here
@yf225 will know :slight_smile:

ModuleList doesn’t store the modules’ type information, and we need to convert the modules to their concrete types for forward to work. So instead of doing module->forward(x), we should do module->as<Linear>()(x).


@yf225 That solves the problem, thank you!

@ramsamy Sorry, I made a mistake in my original answer. Please make sure you are using module->as<Linear>()(x) instead of module->as<Linear>()->forward(x), otherwise module hooks (to be implemented) won’t work :slight_smile:

This method doesn’t seem to work with a ModuleList of nn::Sequential. Is it possible to use nn::Sequential in the ModuleList, and if so, how should the forward be written?

=============
EDIT:
Just found out that
module->as<nn::Sequential>()->forward(x)
works fine. However, module->as<nn::Sequential>()(x) throws a compile-time error.


What if we create the ModuleList dynamically, and have no further knowledge about the actual sub-types?

In the end I had to create an abstract class with a pure virtual forward function and let my modules derive from it.

I tried the forward style @yf225 mentioned above, but it’s not working.

“encoder_” is a torch::nn::ModuleList with some DoubleConv in it.

The forward style on line 79 passes compilation, but I’m not sure whether it’s implemented correctly.

I then used the Unet2D as a model in a training phase.
I encountered an “aten::resize is not implemented” runtime error during the backward pass of the loss at line 134.

Is this runtime error related to the forward style?

@yf225 This solution works when all the objects are of the same type. Could you point out what should be done when there is more than one module type in the ModuleList?