to get a 1D tensor of all the trainable parameters of a given model (and corresponding gradients).
Is it possible to do something like that in PyTorch, so that cnn_params shares the same memory as the corresponding model parameters? I should mention that I only care about the trainable parameters (i.e., weights and biases) and not their gradients.
Thanks for your answer, ptrblck. My idea, if possible, would be to manipulate a single 1D tensor containing all the parameters, so that I can apply an operation once (e.g., replacing parameters above a certain value) instead of looping through the list of modules and repeating it for each one, in order to speed up the algorithm.
In case anyone else is interested, the closest solution I was able to find involves using torch.nn.utils.parameters_to_vector() to get the parameters vector and then calling torch.nn.utils.vector_to_parameters() when I'm done modifying it.
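A minimal sketch of that round trip, using a small hypothetical model (note that parameters_to_vector() returns a copy, not a view, so the flat tensor does not share memory with the model; vector_to_parameters() writes the values back):

```python
import torch
import torch.nn as nn
from torch.nn.utils import parameters_to_vector, vector_to_parameters

# Hypothetical small model, just for illustration
model = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 2))

with torch.no_grad():
    # Flatten all trainable parameters into one 1D tensor (a copy)
    flat = parameters_to_vector(model.parameters())

    # Single bulk operation on every parameter at once,
    # e.g. cap all parameters above a threshold
    flat.clamp_(max=0.5)

    # Copy the modified values back into the model's parameters
    vector_to_parameters(flat, model.parameters())

# Every parameter in every module is now capped at 0.5
assert all(p.max().item() <= 0.5 for p in model.parameters())
```

Because the flat tensor is a copy, any modification must be followed by vector_to_parameters() for it to take effect in the model.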