Trouble pruning weights in a model with many submodules

I am trying to prune a pretrained Hugging Face model from the transformers library, but the code below raises an error.

from transformers import AutoTokenizer, AutoModelForSequenceClassification
import torch.nn.utils.prune as prune

checkpoint = "bert-base-uncased"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSequenceClassification.from_pretrained(checkpoint)

for module in model.named_modules():
    prune.global_unstructured(module, pruning_method=prune.L1Unstructured, amount=0.9)


This raises:

ValueError: not enough values to unpack (expected 2, got 0)

Is there a way to prune all the model's weights, accounting for the fact that most of the "layers" in the model are nested inside larger modules that contain many other layers?
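For context, one likely cause of the error: prune.global_unstructured expects a single flat list of (module, parameter_name) pairs, whereas model.named_modules() yields (name, module) tuples, and many of those modules (e.g. containers, activations) have no weight parameter at all. A minimal sketch of the intended pattern, using a small stand-in model so it runs without downloading a checkpoint (the same loop works unchanged on the BERT model, since it is also an nn.Module):

```python
import torch.nn as nn
import torch.nn.utils.prune as prune

# Stand-in for the Hugging Face model; any nn.Module works the same way,
# including AutoModelForSequenceClassification.from_pretrained(checkpoint).
model = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 2))

# global_unstructured wants ONE flat list of (module, parameter_name) pairs,
# so collect every prunable submodule first instead of calling it per module.
parameters_to_prune = [
    (module, "weight")
    for module in model.modules()          # recurses through nested submodules
    if isinstance(module, nn.Linear)       # keep only modules that have a weight
]

prune.global_unstructured(
    parameters_to_prune,
    pruning_method=prune.L1Unstructured,
    amount=0.9,                            # prune 90% of pooled weights globally
)

# Roughly 90% of the pooled Linear weights should now be zeroed out.
total = sum(m.weight.nelement() for m, _ in parameters_to_prune)
zeros = sum(int((m.weight == 0).sum()) for m, _ in parameters_to_prune)
print(zeros / total)
```

Collecting the pairs first also makes the pruning genuinely global: the 90% threshold is computed over all listed weights at once rather than per layer. Extending the isinstance check (e.g. to nn.Conv2d, or to include "bias") adjusts what gets pooled.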