Is it good practice to assign m.weight.grad to a variable?

Here is a small sample of code:

import torch
import torch.nn as nn

def func(m: nn.Module):
    p_grad = m.weight.grad  # m.weight.grad may be None
    return None

I am wondering: is it good practice to assign m.weight.grad to a variable and use it later? The reason I ask is that m.weight.grad can be None. With PyTorch 1.13.0, mypy throws this error: Item "None" of "Union[Tensor, None, Module]" has no attribute "data".

I use the data attribute to implement a custom optimizer.

No, it’s not, for two reasons: the .data attribute is deprecated and you shouldn’t use it, and (as you can already see) the .grad attribute is not necessarily defined.
Add a check before using .grad to make sure it was actually populated before you try to read or assign it.
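As a minimal sketch of that advice (the function name func and the learning rate 0.1 are made up for illustration, and nn.Linear is used instead of a bare nn.Module so that mypy knows m.weight is a Parameter): assigning m.weight.grad to a local variable is fine, and it even helps mypy narrow the Optional type after the None check. Operate on the tensor itself under torch.no_grad() rather than going through the deprecated .data attribute.

```python
import torch
import torch.nn as nn

def func(m: nn.Linear) -> None:
    # Assign once to a local; after the None check, mypy narrows
    # the type from Optional[Tensor] to Tensor.
    grad = m.weight.grad
    if grad is None:
        return  # no backward pass has populated .grad yet
    # Update the parameter in-place instead of touching .data;
    # no_grad() keeps autograd from recording this step.
    with torch.no_grad():
        m.weight -= 0.1 * grad

lin = nn.Linear(2, 2)
func(lin)  # no-op: .grad is still None
lin(torch.randn(1, 2)).sum().backward()
func(lin)  # now .grad exists and the update runs
```

The same pattern works inside a custom optimizer's step(): check each param.grad for None, skip it if unset, and mutate the parameter under torch.no_grad().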
