My model uses parameter sharing (weight tying) during training, but the saved checkpoint is the same size as the untied model. How can I reduce the on-disk size of a model with shared weights?
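For context, whether shared weights are stored once or twice depends on the serialization format, not on the model itself: a serializer that deduplicates by object identity stores the tied tensor once, while one that walks each entry independently writes it twice. A minimal pure-Python sketch (using `pickle`, which memoizes shared objects, as a stand-in for a checkpoint format; the key names are made up for illustration):

```python
import pickle

# Toy "state dict": encoder and decoder weights tied to the same buffer.
weight = [0.1] * 10_000
tied_state = {"encoder.weight": weight, "decoder.weight": weight}        # same object
untied_state = {"encoder.weight": weight, "decoder.weight": list(weight)}  # a copy

tied_bytes = pickle.dumps(tied_state)      # shared object is serialized once
untied_bytes = pickle.dumps(untied_state)  # the copy is serialized a second time

print(len(tied_bytes), len(untied_bytes))  # tied checkpoint is roughly half the size
```

If your checkpoint ends up the same size either way, the sharing was likely broken before saving (e.g. the tensors were copied or cloned), so checking that the tied entries are still the same object in the saved state is a good first step.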