Error: No activations detected in Lightning module with Opacus

I’ve followed the MNIST Lightning example in the examples directory of the Opacus repo.

The difference is that my module wraps a pretrained DistilBERT model for an NLP classification task, rather than a custom network like the one in the example.

Inside my module’s __init__(), I define:

self.model = DistilBertForSequenceClassification.from_pretrained('distilbert-base-uncased', num_labels=num_labels)
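
For context, the surrounding __init__() roughly follows the example; the class name and the hyperparameter defaults below are illustrative, not my real values:

import pytorch_lightning as pl
from opacus import PrivacyEngine
from transformers import DistilBertForSequenceClassification

class LitDistilBertClassifier(pl.LightningModule):
    def __init__(self, num_labels, lr=2e-5, enable_dp=True):
        super().__init__()
        self.lr = lr
        self.enable_dp = enable_dp
        self.model = DistilBertForSequenceClassification.from_pretrained(
            'distilbert-base-uncased', num_labels=num_labels
        )
        if self.enable_dp:
            # created once here, as in the MNIST Lightning example
            self.privacy_engine = PrivacyEngine()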

My forward() simply delegates to the wrapped model:

def forward(self, input_ids, attention_mask, labels=None):
    # returns a Hugging Face SequenceClassifierOutput; includes .loss when labels are given
    return self.model(input_ids=input_ids, attention_mask=attention_mask, labels=labels)
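
For reference, my training_step() follows the standard pattern. This is a sketch (my real step also logs metrics), assuming batches arrive as (input_ids, attention_mask, labels) tuples:

def training_step(self, batch, batch_idx):
    input_ids, attention_mask, labels = batch
    output = self(input_ids=input_ids, attention_mask=attention_mask, labels=labels)
    return output.loss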

As in the example, I call make_private() inside my module’s configure_optimizers() method.

The only difference is that, to obtain the data loader inside configure_optimizers(), I follow the current Lightning API:

if self.enable_dp:
    self.trainer.fit_loop.setup_data()
    data_loader = self.trainer.train_dataloader
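
Putting it together, the whole method looks roughly like this (torch is imported at the top of the file; the optimizer choice and the noise_multiplier / max_grad_norm values are placeholders rather than my real hyperparameters):

def configure_optimizers(self):
    optimizer = torch.optim.AdamW(self.parameters(), lr=self.lr)
    if self.enable_dp:
        self.trainer.fit_loop.setup_data()
        data_loader = self.trainer.train_dataloader
        # swap in the DP-wrapped model, optimizer, and data loader
        model, optimizer, data_loader = self.privacy_engine.make_private(
            module=self,
            optimizer=optimizer,
            data_loader=data_loader,
            noise_multiplier=1.0,  # placeholder
            max_grad_norm=1.0,     # placeholder
        )
    return optimizer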

When I run the Lightning trainer’s fit() method:

trainer.fit(model, train_loader, val_loader)
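
In case it matters, train_loader and val_loader are ordinary PyTorch DataLoaders over pre-tokenized tensors. They are built roughly like this (toy data shown here; my real dataset is much larger):

import torch
from torch.utils.data import DataLoader, TensorDataset
from transformers import DistilBertTokenizerFast

tokenizer = DistilBertTokenizerFast.from_pretrained('distilbert-base-uncased')
texts, labels = ['a good example', 'a bad example'], [1, 0]
enc = tokenizer(texts, truncation=True, padding=True, return_tensors='pt')
dataset = TensorDataset(enc['input_ids'], enc['attention_mask'], torch.tensor(labels))
train_loader = DataLoader(dataset, batch_size=2, shuffle=True)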

With this setup, I get the following error:

/usr/local/lib/python3.10/dist-packages/opacus/grad_sample/grad_sample_module.py in rearrange_grad_samples(self, module, backprops, loss_reduction, batch_first)
    381         """
    382         if not hasattr(module, "activations"):
--> 383             raise ValueError(
    384                 f"No activations detected for {type(module)},"
    385                 " run forward after add_hooks(model)"

ValueError: No activations detected for <class 'torch.nn.modules.linear.Linear'>, run forward after add_hooks(model)

My Lightning module works fine without Opacus, so I’m not sure what I’m doing wrong.

Thanks in advance.