When I try to run my net forward with net.forward(content_image), I get the following error:
Traceback (most recent call last):
  File "st.py", line 150, in <module>
    net.forward(content_image)
  File "/usr/local/lib/python2.7/dist-packages/torch/nn/modules/container.py", line 67, in forward
    input = module(input)
  File "/usr/local/lib/python2.7/dist-packages/torch/nn/modules/module.py", line 357, in __call__
    result = self.forward(*input, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/torch/nn/modules/module.py", line 71, in forward
    raise NotImplementedError
NotImplementedError
Running net(content_image) results in:
  File "st.py", line 150, in <module>
    net(content_image)
  File "/usr/local/lib/python2.7/dist-packages/torch/nn/modules/module.py", line 357, in __call__
    result = self.forward(*input, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/torch/nn/modules/container.py", line 67, in forward
    input = module(input)
  File "/usr/local/lib/python2.7/dist-packages/torch/nn/modules/module.py", line 357, in __call__
    result = self.forward(*input, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/torch/nn/modules/module.py", line 71, in forward
    raise NotImplementedError
NotImplementedError
And running net.updateOutput(content_image) results in:
Traceback (most recent call last):
  File "st.py", line 150, in <module>
    net.updateOutput(content_image)
  File "/usr/local/lib/python2.7/dist-packages/torch/nn/modules/module.py", line 398, in __getattr__
    type(self).__name__, name))
AttributeError: 'Sequential' object has no attribute 'updateOutput'
My loss modules are set up like this:
class ContentLoss(nn.Module):
    def __init__(self, strength, normalize):
        super(ContentLoss, self).__init__()

    def updateOutput(self, input):
        return self.output

    def updateGradInput(self, input, gradOutput):
        return self.gradInput
When setting up my sequential network, I first add a layer from a pre-trained torchvision model:
net = nn.Sequential()
net.add_module(layer_name, layer)
Then I add my loss modules:
loss_module = ContentLoss(content_weight, norm)
net.add_module(layer_name, loss_module)
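For reference, a minimal self-contained sketch of this assembly pattern (the conv/relu layers stand in for the pre-trained torchvision layers, and a hypothetical pass-through module stands in for the loss module; the module names are placeholders):

```python
import torch
import torch.nn as nn

# Hypothetical stand-in for a loss module: passes its input through unchanged.
class PassThroughLoss(nn.Module):
    def forward(self, input):
        return input

net = nn.Sequential()
# Stand-ins for layers taken from a pre-trained model:
net.add_module('conv_1', nn.Conv2d(3, 8, kernel_size=3, padding=1))
net.add_module('relu_1', nn.ReLU())
# Loss module inserted between the feature layers:
net.add_module('content_loss_1', PassThroughLoss())

out = net(torch.randn(1, 3, 16, 16))
```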
My full code can be found here:
So what am I doing wrong here, and how do I fix it? Is it an issue with how I've set up my custom loss modules, or with how I am trying to run the network forward?
Edit:
I found the solution here: NotimplementedError while densenet implementation. Unlike torch.legacy.nn, nn.Module requires the method to be named forward instead of updateOutput.
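A sketch of the loss module rewritten for torch.nn: the computation goes in forward(), and autograd makes updateGradInput unnecessary. The MSE-against-a-stored-target body is an assumption based on the usual neural-style setup, since the original ContentLoss body isn't shown in full:

```python
import torch
import torch.nn as nn

class ContentLoss(nn.Module):
    def __init__(self, strength):
        super(ContentLoss, self).__init__()
        self.strength = strength
        self.target = None  # content features captured in an earlier pass
        self.loss = 0.0

    def forward(self, input):  # note: forward, not updateOutput
        if self.target is not None:
            self.loss = self.strength * nn.functional.mse_loss(input, self.target)
        return input  # pass the activation through to the next layer
```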
Second Edit:
Now my self.output variable is not being passed correctly to my GramMatrix loss function, even though it should have been created in my ContentLoss function:
AttributeError: 'GramMatrix' object has no attribute 'output'
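That AttributeError usually means self.output is read before any forward pass has assigned it. A sketch of one fix is to initialize the attribute in __init__ and assign it in forward; the Gram-matrix computation below is the standard channel-inner-product form, assumed here since the original class isn't shown:

```python
import torch
import torch.nn as nn

class GramMatrix(nn.Module):
    def __init__(self):
        super(GramMatrix, self).__init__()
        self.output = None  # attribute exists even before the first forward pass

    def forward(self, input):
        # Flatten each channel's feature map, then take inner products
        # between channels; normalize by the number of elements.
        c, h, w = input.size(-3), input.size(-2), input.size(-1)
        features = input.view(c, h * w)
        self.output = torch.mm(features, features.t()) / (c * h * w)
        return self.output
```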