Suppose I need to add an L2 regularization term to my loss, and it's calculated like below:

```
regularizer = torch.tensor(0.)
for name, param in model.named_parameters():
    if 'weight' in name:
        regularizer += torch.sum(torch.norm(param, dim=0))
```

I have two questions.

First, for the initialization of `regularizer`, do I need to set `requires_grad=True`, like this?

```
regularizer = torch.tensor(0., requires_grad=True)
```
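For context, the two questions interact: here is a minimal, self-contained repro (the scalar values are arbitrary, just to illustrate) of what happens when a leaf tensor created with `requires_grad=True` is used with in-place addition.

```python
import torch

# A leaf tensor that requires grad, as in the initialization above
regularizer = torch.tensor(0., requires_grad=True)

try:
    # In-place addition on a leaf that requires grad is rejected by autograd
    regularizer += torch.tensor(1.)
except RuntimeError as e:
    # "a leaf Variable that requires grad is being used in an in-place operation."
    print(type(e).__name__)
```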

Second, is the in-place addition appropriate here, or do I need to change it to the out-of-place form?

```
regularizer = regularizer + torch.sum(torch.norm(param, dim=0))
```
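To make the comparison concrete, here is a runnable sketch of the out-of-place variant, using a hypothetical `nn.Linear` stand-in for my model (the layer sizes are arbitrary). The accumulator starts as a plain tensor, and the result still ends up in the autograd graph because the parameters themselves require grad:

```python
import torch
import torch.nn as nn

# Hypothetical model, only so the snippet is self-contained
model = nn.Linear(4, 3)

# Plain accumulator; it inherits grad tracking from the parameters it touches
regularizer = torch.tensor(0.)
for name, param in model.named_parameters():
    if 'weight' in name:
        # Out-of-place addition: rebinds `regularizer` to a new graph node
        regularizer = regularizer + torch.sum(torch.norm(param, dim=0))

print(regularizer.requires_grad)  # True, via the graph built from the weights
regularizer.backward()            # gradients flow back to model.weight
```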