Hi guys! I have a question about weight_norm.
My PyTorch version is 1.0.1 in both environments, and I set the same seeds. But when I apply weight_norm in Python 2.7 and Python 3.7, I get different weight_g values while weight_v stays the same.
Moreover, I get the same weight_g in both versions when I initialize the linear layer with a lower dimension. I am wondering what changes when I increase the layer size.
I hope someone can give a detailed analysis. Thanks!
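If I read the weight_norm code correctly, with dim=None the hook initializes weight_g as the Frobenius norm of the whole weight matrix and weight_v as a copy of the weight itself, so identical weight_v should normally imply identical weight_g. This is just my own sanity check (the allclose comparison is mine, not from the docs), which I believe should print True if that reading is right:

# checking my understanding of how weight_g is initialized
import torch
import torch.nn as nn
from torch.nn.utils import weight_norm

torch.manual_seed(0)
layer = weight_norm(nn.Linear(20, 20), dim=None)
print(layer.weight_g)                  # scalar parameter
print(torch.norm(layer.weight_v))      # Frobenius norm of weight_v
print(torch.allclose(layer.weight_g, torch.norm(layer.weight_v)))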
For example:
Init with a lower dimension:
# python 3.7
import torch.nn as nn
from torch.nn.utils import weight_norm

weight_norm(nn.Linear(20, 20), dim=None).weight_g
"Parameter containing: tensor(2.6137, requires_grad=True)"
# python 2.7
weight_norm(nn.Linear(20, 20), dim=None).weight_g
"Parameter containing: tensor(2.6137, requires_grad=True)"
Init with a higher dimension:
# python 3.7
weight_norm(nn.Linear(2048, 2048), dim=None).weight_g
"tensor(295.5971, requires_grad=True)"
# python 2.7
weight_norm(nn.Linear(2048, 2048), dim=None).weight_g
"tensor(26.1274, requires_grad=True)"