Hi, I have an issue creating an autoencoder class that subclasses nn.Module.
The following class causes a "NotImplementedError" in module.py when z = model(x) is evaluated.
I am working on Python 3.6 and get the same error in both Spyder and the command prompt.
How can I solve this?
Thank you in advance
class Autoencoder(nn.Module):
    def __init__(self):
        super(Autoencoder, self).__init__()  # initialize the nn.Module base class
        self.encoder = nn.Sequential(
            nn.Linear(28 * 28, 128),
            nn.ReLU(True),
            nn.Linear(128, 64),
            nn.ReLU(True),
            nn.Linear(64, 12),
            nn.ReLU(True),
            nn.Linear(12, 2)
        )
        self.decoder = nn.Sequential(
            nn.Linear(2, 12),
            nn.ReLU(True),
            nn.Linear(12, 64),
            nn.ReLU(True),
            nn.Linear(64, 128),
            nn.ReLU(True),
            nn.Linear(128, 28 * 28),
            nn.Tanh()
        )

    def forward(self, x):
        x = self.encoder(x)
        x = self.decoder(x)
        return x
  File "", line 1, in <module>
    runfile('C:/usr/local/Anaconda3/mylib/Autoencoder_test.py', wdir='C:/usr/local/Anaconda3/mylib')
  File "C:\usr\local\Anaconda3\lib\site-packages\spyder\utils\site\sitecustomize.py", line 705, in runfile
    execfile(filename, namespace)
  File "C:\usr\local\Anaconda3\lib\site-packages\spyder\utils\site\sitecustomize.py", line 102, in execfile
    exec(compile(f.read(), filename, 'exec'), namespace)
  File "C:/usr/local/Anaconda3/mylib/Autoencoder_test.py", line 85, in <module>
    xhat = model(x)
  File "C:\usr\local\Anaconda3\lib\site-packages\torch\nn\modules\module.py", line 477, in __call__
    result = self.forward(*input, **kwargs)
  File "C:\usr\local\Anaconda3\lib\site-packages\torch\nn\modules\module.py", line 83, in forward
    raise NotImplementedError
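For context, this error comes from nn.Module's own forward, which only raises NotImplementedError; __call__ dispatches to self.forward, so if the subclass never actually defines forward (e.g. it is indented at module level instead of inside the class body), the base version runs. A minimal pure-Python stand-in of that dispatch (the class names here are illustrative, not PyTorch's real implementation):

```python
# Stand-in mimicking nn.Module's dispatch: __call__ invokes
# self.forward, and the base class's forward only raises.
class Module:
    def __call__(self, *args, **kwargs):
        return self.forward(*args, **kwargs)

    def forward(self, *args, **kwargs):
        raise NotImplementedError


# forward accidentally defined OUTSIDE the class (wrong indentation):
class Broken(Module):
    pass

def forward(self, x):  # module-level function; Broken never sees it
    return x + 1


# forward correctly indented inside the class body:
class Fixed(Module):
    def forward(self, x):
        return x + 1


try:
    Broken()(1)
except NotImplementedError:
    print("Broken raises NotImplementedError")

print(Fixed()(1))  # prints 2
```

The same dispatch explains why re-indenting forward into the class fixes the error without any other change.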
Thank you for your kind reply. I tried it out, but unfortunately could not figure out the problem.
Instead, I tried another class structure as follows, and now the code seems to work!
I will also keep the correct indentation in mind, as you suggest.
I really appreciate your help, albanD and InnovArul
I have also faced this problem, and although I tried the solutions, I kept getting the same error.
I restarted the notebook's kernel and tried the solutions again. It WORKED. I just wanted to mention this in case one of you faces the same thing.
I had the same error because I was passing the input of the model directly to an nn.ModuleList object, instead of passing it through each of its contained modules.
So my original code looked like:
class Discriminator(nn.Module):
    def __init__(self, hidden_dim=128, depth=3, in_dim=200):
        super(Discriminator, self).__init__()
        # parameters
        self.in_dim = in_dim
        # model architecture
        self.layers = nn.ModuleList([nn.Linear(in_dim, hidden_dim),
                                     nn.BatchNorm1d(hidden_dim),
                                     nn.LeakyReLU()])
        for i in range(0, depth - 1):
            self.layers.extend([nn.Linear(hidden_dim // 2**i, hidden_dim // 2**(i + 1)),
                                nn.BatchNorm1d(hidden_dim // 2**(i + 1)),
                                nn.LeakyReLU()])
        self.layers.extend([nn.Linear(hidden_dim // 2**(depth - 1), 1)])

    def forward(self, x):
        x = x.view(-1, self.in_dim)
        return self.layers(x)  # bug: nn.ModuleList is not callable
And to fix it I just changed the last line to:
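For reference, nn.ModuleList only registers its submodules so they show up in parameters(); it is not callable itself, so the common pattern is to apply each contained layer in turn inside forward. A minimal stand-in sketch of that loop, using plain Python callables in place of nn layers:

```python
# Stand-in: a plain list of callables in place of nn.ModuleList.
layers = [
    lambda x: x * 2,      # stand-in for nn.Linear
    lambda x: max(x, 0),  # stand-in for an activation
    lambda x: x - 1,
]

def forward(x):
    # Apply each contained module in sequence, instead of calling
    # the container itself (layers(x) would fail: a list is not
    # callable, just as nn.ModuleList is not).
    for layer in layers:
        x = layer(x)
    return x

print(forward(3))  # 3*2=6 -> max(6,0)=6 -> 6-1=5, prints 5
```

If the layers are always applied strictly in order like this, nn.Sequential would also work and is callable directly.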