How to disable backprop for some layers?

Hello. I have a network that takes an image variable as input. At the beginning of the network, I need to resize the image to several different sizes, so I use adaptive average pooling. However, since adaptive average pooling is an nn module, I assume it records some state for backprop, and processing that during the backward pass takes time, which in my setting is unnecessary: the image variable is at the very beginning of the network, so there is no need to backprop any further. So I want to ask how to disable backprop for a layer in this situation. Thanks!
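For reference, the setup described above can be sketched as follows. If the resizing is wrapped in `torch.no_grad()` (or the input simply has `requires_grad=False`, the default for data tensors), autograd records nothing for the pooling ops, so there is no backprop cost at that point. Shapes and sizes here are illustrative:

```python
import torch
import torch.nn.functional as F

# A data tensor has requires_grad=False by default, so autograd would
# already skip these ops; no_grad() makes that intent explicit.
img = torch.randn(1, 3, 224, 224)

with torch.no_grad():
    # Resize the same image to several resolutions at the start of the net.
    scales = [F.adaptive_avg_pool2d(img, (s, s)) for s in (224, 112, 56)]

# No graph was built: none of the outputs track gradients.
assert all(not t.requires_grad for t in scales)
```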


If you just need to resize your image to one or several specific sizes before feeding it to your network, you could do it in a Dataset or just before the training phase.
AdaptiveAvgPooling is useful if you have to deal with differently sized inputs, which have to be resized to a specific size inside the network (not before!).
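Doing the resizing in a Dataset, as suggested, might look like this sketch (the class name, sizes, and dict keys are placeholders, not anything from the original post):

```python
import torch
import torch.nn.functional as F
from torch.utils.data import Dataset

# Hypothetical dataset that returns each image at several resolutions,
# so the network itself never has to resize anything.
class MultiScaleDataset(Dataset):
    def __init__(self, images, sizes=(128, 64, 32)):
        self.images = images  # list of (C, H, W) tensors
        self.sizes = sizes

    def __len__(self):
        return len(self.images)

    def __getitem__(self, idx):
        img = self.images[idx]
        # Resize once per requested resolution, outside the network.
        return {f"x{s}": F.adaptive_avg_pool2d(img, (s, s)) for s in self.sizes}

ds = MultiScaleDataset([torch.randn(3, 256, 256) for _ in range(4)])
sample = ds[0]
```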

Thanks for your reply. I see your point. However, I need the image at 6 different resolutions, which would add too many arguments to the network… Packing the images in a dict might be a way, but when I have a batch size larger than one… I am not sure whether the batching the dataset performs will handle a dict… So I think doing the resize inside the network is another option…
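On the dict concern: PyTorch's default collate function does batch dict samples key-wise, so returning the resolutions in a dict should work with batch_size > 1. A quick check with toy shapes (keys and sizes are illustrative):

```python
import torch
from torch.utils.data import DataLoader

# Toy samples: each element is a dict of tensors at two resolutions.
samples = [{"x4": torch.randn(3, 4, 4), "x2": torch.randn(3, 2, 2)}
           for _ in range(8)]

# default_collate stacks the tensors under each key into one batch.
loader = DataLoader(samples, batch_size=4)
batch = next(iter(loader))
# batch["x4"] has shape (4, 3, 4, 4); batch["x2"] has shape (4, 3, 2, 2)
```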