I’ve trained a DenseNet121 model and am trying to get hold of one of the final layers for a Grad-CAM implementation. The implementation uses model._modules as a dict mapping layer names to layers, but for some reason my model’s dict has only one key-value pair: “module”, mapped to the entire DenseNet object. What should I do instead? My PyTorch version is 1.2.0.
I’m looking for a layer to create class activation maps from, so probably one of the last ones, such as “module.densenet121.features.denseblock4.denselayer16.conv.2” or “module.densenet121.features.norm5”.
My model is wrapped in torch.nn.DataParallel(model). How do I get “under the hood” to access the DenseNet itself? I suspect this wrapping is the cause of the problem described above.
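For reference, here is a minimal sketch (using a small stand-in network rather than the real DenseNet121) of why _modules collapses to a single “module” entry under DataParallel, and how .module gets back underneath it:

```python
import torch.nn as nn

# Stand-in for the trained DenseNet121 (hypothetical; the real model
# would come from torchvision.models.densenet121).
net = nn.Sequential(nn.Conv2d(3, 8, 3), nn.ReLU())
wrapped = nn.DataParallel(net)

# DataParallel registers the wrapped network under the single name
# "module", which is why model._modules has only one key-value pair.
print(list(wrapped._modules.keys()))  # ['module']

# Unwrapping restores direct access to the original network:
inner = wrapped.module
assert inner is net

# Layer names seen from the wrapper are prefixed with "module."
# (e.g. "module.0"); from the unwrapped network they are just "0", "1", ...
print([name for name, _ in wrapped.named_modules()])
```

This also explains the “module.” prefix on the layer names above: those are paths as seen from the DataParallel wrapper, so after unwrapping you would look them up without the prefix.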
I am working on classification of five different types of skin images, and I have trained an AlexNet model and a GoogLeNet model. Now I want to implement Grad-CAM on the final layer of these models and get the heat-maps.
After reading many blogs and code examples, I have reached this point. This code specifically uses the ResNet-50 model:
model = models.resnet50(pretrained=True)
grad_cam = GradCam(model=model, feature_module=model.layer4, \
target_layer_names=["2"], use_cuda=args.use_cuda)
How should I pass feature_module and target_layer_names to the constructor of the GradCam class for AlexNet and for GoogLeNet?
I realized I need to get the final layer from the pretrained network by using the following code.