Freezing inner layers of VAEs

Sorry, I can't fully follow the structure of your network, but I think your main question is how to freeze a part of the network that contains conv layers and BN layers.

You can achieve it by:

  1. Set the decoder to eval mode:
    decoder.eval()
    
    This freezes the BN layers (and dropout). BN layers have, besides learnable parameters, buffers (the running mean and variance) that are not touched by the optimizer but are updated automatically during forward passes in training mode. Please see the explanation at How to properly fix batchnorm layers.
  2. Exclude the decoder's parameters from the optimizer.
  3. (Optional) Set requires_grad=False on the decoder's parameters. I think this mainly speeds up training and saves memory, since autograd can skip those tensors. If not, please correct me.
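The three steps above can be sketched as follows. This is a minimal toy setup, not your actual VAE; the `encoder`/`decoder` modules here are stand-ins I made up for illustration:

```python
import torch
import torch.nn as nn

# Toy encoder/decoder with conv + BN layers (placeholders for your VAE parts)
encoder = nn.Sequential(nn.Conv2d(1, 8, 3, padding=1), nn.BatchNorm2d(8), nn.ReLU())
decoder = nn.Sequential(nn.Conv2d(8, 1, 3, padding=1), nn.BatchNorm2d(1), nn.ReLU())

# 1. Eval mode: BN running stats (buffers) stop updating during forward passes
decoder.eval()

# 2. Build the optimizer from the encoder's parameters only
optimizer = torch.optim.Adam(encoder.parameters(), lr=1e-3)

# 3. (Optional) Stop gradient computation for the decoder's weights
for p in decoder.parameters():
    p.requires_grad = False

# Forward pass: decoder BN buffers stay fixed, encoder still trains normally
x = torch.randn(2, 1, 8, 8)
out = decoder(encoder(x))
```

One caveat: if you later call `.train()` on a parent module that contains the decoder, the decoder is flipped back to training mode, so you need to call `decoder.eval()` again after each such call (e.g. at the start of every epoch).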