I would like to look at my decoder weight matrix (as a heatmap). I am a bit confused about the shapes; should I use the 5th matrix? (The sizes can be seen in model.parameters().)
You could get more information by printing the name along with each parameter using model.named_parameters().
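A minimal sketch of how the names map to shapes, using a hypothetical autoencoder (the layer sizes and architecture here are made up for illustration; your model will differ):

```python
import torch.nn as nn

# Hypothetical autoencoder just to illustrate the naming scheme;
# your actual architecture and shapes will differ.
class AutoEncoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(10, 4),
            nn.BatchNorm1d(4),
            nn.ReLU(),
        )
        self.decoder = nn.Sequential(
            nn.Linear(4, 10),
        )

model = AutoEncoder()

# named_parameters() yields (name, tensor) pairs, so you can see
# which shape belongs to which layer instead of guessing from the order.
for name, param in model.named_parameters():
    print(name, tuple(param.shape))
```

Here the 2-dimensional tensors (e.g. `encoder.0.weight`) are the linear layer weights, while the 1-dimensional ones belong to the biases and the batch norm layer.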
Most likely the first and 5th outputs are the weight matrices of the linear layers.
The other (1-dimensional) parameters would be the biases of the linear layers and the weight and bias of the batch norm layers.
No, ReLU does not contain any parameters. Those entries correspond to the nn.BatchNorm layer in the encoder.
The index in encoder.model.1 corresponds to the ModuleList index shown when you print the model.
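Once you know the index of the linear layer, plotting the weight as a heatmap could look like this sketch (the decoder definition and file name here are assumptions, not your actual model):

```python
import matplotlib
matplotlib.use("Agg")  # headless backend; drop this line for interactive use
import matplotlib.pyplot as plt
import torch.nn as nn

# Hypothetical decoder; replace with your own model and index.
decoder = nn.Sequential(
    nn.Linear(4, 10),
    nn.ReLU(),
)

# Index into the Sequential/ModuleList to grab the linear layer,
# then detach the weight and move it to the CPU before converting to numpy.
weight = decoder[0].weight.detach().cpu().numpy()

plt.imshow(weight, cmap="viridis", aspect="auto")
plt.colorbar()
plt.title("decoder[0].weight")
plt.savefig("decoder_weight.png")
```

The `detach()` call is needed because the weight still tracks gradients; `.numpy()` would raise an error otherwise.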