Extracting the weights of one class from the softmax layer

I have trained a neural network for binary image classification and want to retrieve the weights of one of the classes so I can use them when training another network.
Is this possible? If so, how can I do it?

A softmax layer doesn’t use any parameters; it just applies an operation that normalizes the logits to probabilities.
I’m not sure I understand the question correctly.
Would you like to somehow extract, from all layers, the weights that are responsible for predicting a specific class?
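As a quick sanity check of the point above, this minimal sketch shows that nn.Softmax holds no learnable parameters and only normalizes the logits:

```python
import torch
import torch.nn as nn

# Softmax is a parameter-free operation: it has no weight or bias.
softmax = nn.Softmax(dim=1)
print(list(softmax.parameters()))  # empty list

# It just maps logits to probabilities that sum to 1 per sample.
logits = torch.randn(2, 10)
probs = softmax(logits)
print(probs.sum(dim=1))  # each row sums to 1
```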

I would like to extract the weights learned for a particular class so I can reuse them later.

Would you like to get the weights for a particular class from the last linear layer?
If so, you could use:

import torch.nn as nn

in_features = 32
nb_classes = 10
lin = nn.Linear(in_features, nb_classes)
# print the weight row and bias for class0
print(lin.weight[0], lin.bias[0])

There wouldn’t be an easy way to extract “all parameters” for a specific class, since the preceding layers (in the case of linear layers) are shared by all classes.
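If the goal is to reuse that class’s row in another network, one option is to copy it into the corresponding row of the new model’s last linear layer. This is a minimal sketch, assuming the new layer has the same in_features; the names src and dst are placeholders for the trained and new layers:

```python
import torch
import torch.nn as nn

in_features = 32
src = nn.Linear(in_features, 10)  # stands in for the trained model's last layer
dst = nn.Linear(in_features, 2)   # stands in for the new model's last layer

# copy the weight row and bias for class0 into slot 0 of the new layer
with torch.no_grad():
    dst.weight[0].copy_(src.weight[0])
    dst.bias[0].copy_(src.bias[0])

print(torch.equal(dst.weight[0], src.weight[0]))  # True
```

Note that these copied parameters will still be updated if you train the new model without freezing them.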