Thanks very much @tom, very helpful!
Just for others who might have similar questions: when you use
for n, m in net.named_modules():
    print(n, '\n', m)
you’ll get output along the lines of (just one block of the net):
b5.b3
Sequential(
  (0): Conv2d(832, 48, kernel_size=(1, 1), stride=(1, 1))
  (1): BatchNorm2d(48, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
  (2): ReLU(inplace)
  (3): Conv2d(48, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
  (4): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
  (5): ReLU(inplace)
  (6): Conv2d(128, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
  (7): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
  (8): ReLU(inplace)
)
Then you’ll know exactly how to access the selected layer: use the dotted name to reach the submodule and index into the Sequential by position, like this:
print(net.b5.b3[2])  # the ReLU (activation) layer at index 2
Then you can adapt the callback from the previous post to use this way of indexing.
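As a minimal, self-contained sketch of the whole idea: the toy `Net` below is hypothetical (standing in for the GoogLeNet-style model above, with made-up channel sizes), but it shows `named_modules()` listing the dotted names, indexing into the `Sequential` by position, and registering a forward hook (the "callback") on the selected layer.

```python
import torch
import torch.nn as nn

# Hypothetical toy model mimicking the b5.b3 structure above.
class Net(nn.Module):
    def __init__(self):
        super().__init__()
        self.b5 = nn.Module()  # bare container, just to get the dotted name
        self.b5.b3 = nn.Sequential(
            nn.Conv2d(3, 8, kernel_size=1),
            nn.BatchNorm2d(8),
            nn.ReLU(inplace=True),
        )

    def forward(self, x):
        return self.b5.b3(x)

net = Net()

# named_modules() yields (dotted_name, module) pairs, e.g. 'b5.b3', 'b5.b3.2'.
for n, m in net.named_modules():
    print(n, '->', type(m).__name__)

# Index into the Sequential by position to grab the ReLU,
# then register a forward hook on it to capture its output.
activations = {}
def hook(module, inputs, output):
    activations['b5.b3.2'] = output.detach()

handle = net.b5.b3[2].register_forward_hook(hook)
net(torch.randn(1, 3, 4, 4))
handle.remove()

print(activations['b5.b3.2'].shape)  # torch.Size([1, 8, 4, 4])
```

Remember to call `handle.remove()` once you no longer need the hook, or it will keep firing (and holding tensors) on every forward pass.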
Once again thanks so much for simplifying this.
Isn’t that just genius? I think it is!