How to manually update weights of conv2d layer in forward pass

Is there a way I can divide the weights of a conv2d layer by their L2 norm (similar to ArcFace/InsightFace, but for a conv layer rather than a fully connected layer)?
My initial guess is that I can add

with torch.no_grad():
    net.conv1.weight.copy_(custom_weight_norm_v1(net.conv1.weight))

where my custom_weight_norm_v1 function is:

def custom_weight_norm_v1(a):
    # a has shape (out_channels, in_channels, kH, kW)
    x1, x2, x3, x4 = a.shape
    c = a.view(x1, -1)                           # flatten each output filter
    c = c / torch.norm(c, dim=1, keepdim=True)   # L2-normalize per filter
    return c.view(x1, x2, x3, x4)

I tried no_grad in both the forward pass and the training loop. Is this going to work?
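For context, here is a minimal runnable sketch of the alternative I'm considering: instead of overwriting the parameter under no_grad, normalize the weight on the fly inside forward and pass it to F.conv2d, so gradients flow through the normalization. The class name SmallNet and all shapes here are made up for illustration:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def custom_weight_norm_v1(a):
    # L2-normalize each output filter of a conv weight (out, in, kH, kW)
    x1 = a.shape[0]
    c = a.view(x1, -1)
    c = c / torch.norm(c, dim=1, keepdim=True)
    return c.view_as(a)

class SmallNet(nn.Module):  # toy net, names/shapes are illustrative
    def __init__(self):
        super().__init__()
        self.conv1 = nn.Conv2d(3, 8, kernel_size=3, padding=1)

    def forward(self, x):
        # normalize the weight functionally; the parameter itself is untouched
        w = custom_weight_norm_v1(self.conv1.weight)
        return F.conv2d(x, w, self.conv1.bias, padding=1)

net = SmallNet()
out = net(torch.randn(2, 3, 16, 16))
print(out.shape)  # torch.Size([2, 8, 16, 16])
```

With this variant each filter used in the convolution has unit L2 norm, and autograd differentiates through the division, which the no_grad + copy_ approach would not do.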

Also, one more question: every time I train I get a different result (even though I set a seed).
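For reference, this is how I am seeding everything before training; the helper name seed_everything is my own. If I understand correctly, on GPU one also needs the cuDNN flags below for run-to-run determinism:

```python
import random
import numpy as np
import torch

def seed_everything(seed=0):
    # seed all RNGs that PyTorch training typically touches
    random.seed(seed)
    np.random.seed(seed)
    torch.manual_seed(seed)
    torch.cuda.manual_seed_all(seed)
    # make cuDNN pick deterministic kernels (may be slower)
    torch.backends.cudnn.deterministic = True
    torch.backends.cudnn.benchmark = False

seed_everything(123)
```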


@ptrblck Can you help me please? :’)