Is_leaf behavior for weight_norm'ed modules and implications for deepcopy

Using PyTorch 0.4.1

I noticed that immediately after applying weight_norm to a module, module.weight.is_leaf is False. After running the module’s forward pass inside a torch.no_grad() block, it changes to True. Is this expected?
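A minimal repro (the layer type and sizes are arbitrary, just for illustration):

```python
import torch
import torch.nn as nn
from torch.nn.utils import weight_norm

m = weight_norm(nn.Linear(4, 4))
print(m.weight.is_leaf)  # False: weight was just computed from weight_g and weight_v

# Any forward pass under no_grad seems to flip it:
with torch.no_grad():
    m(torch.randn(1, 4))
print(m.weight.is_leaf)  # True: the pre-forward hook recomputed weight without grad history
```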

While is_leaf is False, attempts to copy the module with copy.deepcopy fail with: “Only Tensors created explicitly by the user (graph leaves) support the deepcopy protocol at the moment”.
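For example (same arbitrary Linear as above), deepcopy on a freshly wrapped module fails:

```python
import copy

import torch.nn as nn
from torch.nn.utils import weight_norm

m = weight_norm(nn.Linear(4, 4))  # m.weight is still a non-leaf tensor here
try:
    copy.deepcopy(m)
except RuntimeError as e:
    print(e)  # the graph-leaves deepcopy error quoted above
```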

I noticed this because I want to be able to load a model and copy it before running any inference. It’s not a big deal, since I can work around it, but it seemed like potentially unintended behavior.
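For what it’s worth, my workaround is along these lines: run a dummy forward pass under torch.no_grad() first, which (per the observation above) turns weight back into a leaf, and only then deepcopy. The dummy input shape is just whatever the model expects:

```python
import copy

import torch
import torch.nn as nn
from torch.nn.utils import weight_norm

model = weight_norm(nn.Linear(4, 4))

# The pre-forward hook recomputes weight; under no_grad it carries no
# autograd history, so it becomes a leaf tensor again.
with torch.no_grad():
    model(torch.randn(1, 4))

model_copy = copy.deepcopy(model)  # succeeds now that weight.is_leaf is True
```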

The same error occurs in PyTorch 1.6.0.