After spectral_norm, weight.grad is None

After applying spectral_norm, backward() leaves no grad on the weight. First, without spectral_norm, the gradient is populated as expected:

import torch
import torch.nn as nn
from torch.nn.utils import spectral_norm

m = nn.Conv2d(1, 1, 3, 1, 1)
m(torch.randn((1, 1, 3, 3))).mean().backward()
print(m.weight, m.weight.grad)

(Parameter containing:
tensor([[[[ 0.1578,  0.1756,  0.3056],
          [ 0.1704, -0.0901, -0.1729],
          [ 0.0391, -0.2639, -0.1461]]]], requires_grad=True),
 tensor([[[[-0.0684,  0.1441, -0.0647],
           [-0.1048,  0.0695, -0.1498],
           [-0.2293, -0.0627, -0.1013]]]]))

But when I apply spectral_norm, there is no grad:

m = spectral_norm(nn.Conv2d(1, 1, 3, 1, 1))
m(torch.randn((1, 1, 3, 3))).mean().backward()
print(m.weight, m.weight.grad)

(tensor([[[[ 0.2949,  0.3651, -0.1934],
           [-0.5089, -0.3238, -0.2878],
           [-0.3722, -0.1071,  0.3818]]]], grad_fn=<...>), None)

spectral_norm reparameterizes the module: the weight is recomputed from the weight_orig parameter on every forward pass, so m.weight is no longer a leaf tensor (note the grad_fn in the output above) and does not accumulate a gradient. You can access the gradient via m.weight_orig.grad instead.
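
A minimal sketch checking this (weight_orig and weight_u are the names spectral_norm registers on the module; exact repr details may vary by PyTorch version):

import torch
import torch.nn as nn
from torch.nn.utils import spectral_norm

m = spectral_norm(nn.Conv2d(1, 1, 3, 1, 1))
m(torch.randn((1, 1, 3, 3))).mean().backward()

# m.weight is recomputed from m.weight_orig each forward pass,
# so it is a non-leaf tensor and does not accumulate .grad.
print(m.weight.is_leaf)        # False
print(m.weight.grad)           # None (recent versions also emit a UserWarning)

# The trainable leaf parameter is m.weight_orig; its grad is populated.
print(m.weight_orig.is_leaf)   # True
print(m.weight_orig.grad)      # tensor([[[[...]]]])

# spectral_norm removes the 'weight' parameter and registers 'weight_orig'
# (plus a 'weight_u' buffer), so m.parameters() already hands the right
# tensors to an optimizer.
print([name for name, _ in m.named_parameters()])  # ['bias', 'weight_orig']

Since weight_orig is what shows up in m.parameters(), an optimizer constructed the usual way (e.g. torch.optim.SGD(m.parameters(), lr=0.1)) updates the correct tensor without any extra handling.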