TypeError: torch.nn.modules.activation.ReLU is not a Module subclass

import torch.nn as nn
import torch.nn.functional as F

class Net(nn.Module):
    def __init__(self):
        super(Net, self).__init__()

        self.layer1 = nn.Sequential(
            nn.Conv2d(in_channels=3, out_channels=16, kernel_size=3, stride=1, padding=1),
            nn.ReLU(),
            nn.BatchNorm2d(16),
        )

When I run the above code, I get the error below.

TypeError: torch.nn.modules.activation.ReLU is not a Module subclass


Which PyTorch version are you using? (You can see it via print(torch.__version__)).
The code is running fine on my system.

print(torch.__version__)

1.2.0

How did you install PyTorch? Could you create a new virtual environment and reinstall it?

Do I have to uninstall first?
Can you please share the best practice for creating a virtual env?

You don’t need to change your current setup if you create a new virtual environment.

I’m personally using conda, as I think their env setup makes it convenient to switch quickly, and in the worst case you can just delete a “broken” environment.

  • download Anaconda
  • create a new env via: conda create -n my_env_name python=3.7 anaconda (the last anaconda argument pulls in a lot of libs like numpy, matplotlib etc., so you might want to skip it)
  • activate the env via: conda activate my_env_name
  • install the PyTorch binaries (a quick sanity check is sketched below)
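
Once the env is active and PyTorch is installed, a quick check from inside it (just a generic sanity check, not part of the original thread) confirms the new binaries are being picked up:

import torch

print(torch.__version__)          # should print the freshly installed version
print(torch.cuda.is_available())  # True only for a CUDA build with a visible GPU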

I created a new env and installed the PyTorch libraries:

(eva-4) C:\Users\gajanana_ganjigatti>python -m ipykernel install --user --name=my_env_name
Installed kernelspec eva-4 in C:\Users\gajanana_ganjigatti\AppData\Roaming\jupyter\kernels\my_env_name

I'm getting the same error, shown below.

in __init__(self)
     20             nn.ReLU,
     21             nn.BatchNorm2d(64),
---> 22             nn.MaxPool2d(2, 2),
     23         )
     24

C:\ProgramData\Anaconda3\lib\site-packages\torch\nn\modules\container.py in __init__(self, *args)
     51         else:
     52             for idx, module in enumerate(args):
---> 53                 self.add_module(str(idx), module)
     54
     55     def _get_item_by_idx(self, iterator, idx):

C:\ProgramData\Anaconda3\lib\site-packages\torch\nn\modules\module.py in add_module(self, name, module)
    192         if not isinstance(module, Module) and module is not None:
    193             raise TypeError("{} is not a Module subclass".format(
--> 194                 torch.typename(module)))
    195         elif not isinstance(name, torch._six.string_classes):
    196             raise TypeError("module name should be a string. Got {}".format(
TypeError: torch.nn.modules.activation.ReLU is not a Module subclass

Could you make sure you are using the posted code snippet and do not forget to create an instance of nn.ReLU?
This code will yield your error:

layer1=nn.Sequential(
    nn.Conv2d(in_channels=3, out_channels=16, kernel_size=3, stride=1, padding=1),
    nn.ReLU, # missing () !!!
    nn.BatchNorm2d(16),
)

while your posted code should work.
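
As a minimal sketch of the difference (not from the original post): nn.ReLU is the class itself, while nn.ReLU() is a module instance, which is what nn.Sequential expects.

import torch.nn as nn

# passing the class raises: TypeError: ... is not a Module subclass
# nn.Sequential(nn.ReLU)

# passing an instance works, because nn.ReLU() is an nn.Module
layer = nn.Sequential(nn.ReLU())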


Thanks a ton. A silly mistake made me create another env.

issue fixed :slight_smile: Cowabunga :muscle:

One question!
I’m not able to see my activation modules when I do

print(model)

only convolution, batchnorm, avgpool, etc. show up.
Is there any way to see my activation layer?

I want to load parameters from a pretrained model and just change the activation layers (sigmoid to ReLU, for example…). Would this be achievable?

You should be able to see the activation modules, if you’ve created them as modules.
Note that functional API calls won’t be printed in this way, i.e. if you’ve used F.relu() in the forward.
Could this be the case for your model?
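
If the activations were created as modules, swapping them after loading the pretrained parameters is mostly a matter of reassigning the corresponding attributes. A minimal sketch, assuming a Sigmoid-to-ReLU swap (the helper name and file path are just for illustration):

import torch
import torch.nn as nn

def swap_activations(module, old=nn.Sigmoid, new=nn.ReLU):
    # hypothetical helper: recursively replace activation modules in-place
    for name, child in module.named_children():
        if isinstance(child, old):
            setattr(module, name, new())
        else:
            swap_activations(child, old, new)

# model.load_state_dict(torch.load("pretrained.pth"))  # load the weights first
# swap_activations(model)                              # then swap Sigmoid -> ReLU
# print(model)                                         # the ReLU modules are now visible

Since the activation modules don't hold any parameters, the loaded state_dict is unaffected by the swap.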


Instead of using nn.ReLU in the Sequential block, create an instance of nn.ReLU and use that instance inside the Sequential block.
E.g.:

relu1 = nn.ReLU()
self.layer1 = nn.Sequential(
    nn.Conv2d(in_channels=3, out_channels=16, kernel_size=3, stride=1, padding=1),
    relu1,
    nn.BatchNorm2d(16),
)