main.cpp: In member function ‘at::Tensor NetImpl::forward(at::Tensor)’:
main.cpp:36:25: error: ‘using element_type = class torch::nn::Module {aka class torch::nn::Module}’ has no member named ‘forward’
x = module->forward(x);
^~~~~~~
I am using libtorch version 1.6.0.dev20200415+cpu on Linux. Calling the module directly instead fails as well:
main.cpp: In member function ‘at::Tensor NetImpl::forward(at::Tensor)’:
main.cpp:36:25: error: no match for call to ‘(const std::shared_ptr<torch::nn::Module>) (at::Tensor&)’
x = module(x);
It works only when I define a Linear layer outside the ModuleList. However, I need the layers to be in a ModuleList for the rest of my program.
ModuleList doesn’t store the modules’ type information, and we need to convert the modules to the concrete types for forward to work. So instead of doing module->forward(x), we should do module->as<Linear>()(x).
@ramsamy Sorry, I made a mistake in my original answer. Please make sure you are using module->as<Linear>()(x) instead of module->as<Linear>()->forward(x); otherwise module hooks (to be implemented) won’t work.
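To make this concrete, here is a minimal sketch of such a forward, assuming a NetImpl (the name from the error above) that holds a ModuleList of Linear layers; the layer sizes are made up. Note that as<Linear>() returns a pointer to the concrete LinearImpl, so the sketch calls forward() through that pointer:

```cpp
#include <torch/torch.h>

// Minimal sketch: a stack of Linear layers kept in a ModuleList.
struct NetImpl : torch::nn::Module {
  NetImpl() {
    layers = register_module("layers", torch::nn::ModuleList(
        torch::nn::Linear(16, 32),
        torch::nn::Linear(32, 8)));
  }

  torch::Tensor forward(torch::Tensor x) {
    for (const auto& module : *layers) {
      // Iterating a ModuleList yields std::shared_ptr<torch::nn::Module>,
      // which has no forward(); cast each entry to its concrete type first.
      x = module->as<torch::nn::Linear>()->forward(x);
    }
    return x;
  }

  torch::nn::ModuleList layers;
};
TORCH_MODULE(Net);
```

With this, Net net; auto y = net->forward(torch::randn({4, 16})); runs all layers in order.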
This method does not seem to work with a ModuleList of nn::Sequential. Is it possible to use nn::Sequential in a ModuleList, and if so, how should the forward be written?
=============
EDIT:
Just found out that
module->as<nn::Sequential>()->forward(x)
works fine. However, module->as<nn::Sequential>()(x) throws a compile-time error.
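For reference, a minimal sketch of that working pattern, assuming a ModuleList whose entries are small Sequential blocks (the block contents, sizes, and names are illustrative, not from the original post):

```cpp
#include <torch/torch.h>

// Hypothetical model: a ModuleList holding Sequential blocks.
struct BlockNetImpl : torch::nn::Module {
  BlockNetImpl() {
    blocks = register_module("blocks", torch::nn::ModuleList(
        torch::nn::Sequential(torch::nn::Linear(16, 32), torch::nn::ReLU()),
        torch::nn::Sequential(torch::nn::Linear(32, 8), torch::nn::ReLU())));
  }

  torch::Tensor forward(torch::Tensor x) {
    for (const auto& module : *blocks) {
      // as<nn::Sequential>() yields a SequentialImpl*, so forward() is
      // called through the pointer rather than via the call operator.
      x = module->as<torch::nn::Sequential>()->forward(x);
    }
    return x;
  }

  torch::nn::ModuleList blocks;
};
TORCH_MODULE(BlockNet);
```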
Then I used the Unet2D as a model for a training phase.
I encountered an “aten::resize is not implemented” runtime error during the backward pass of the loss at line 134.
Is this runtime error related to the forward style?