Weights copy code

I found this code

for m in self.modules():
    if isinstance(m, nn.Conv2d) or isinstance(m, nn.Linear):
        import scipy.stats as stats
        stddev = m.stddev if hasattr(m, 'stddev') else 0.1
        X = stats.truncnorm(-2, 2, scale=stddev)
        values = torch.Tensor(X.rvs(m.weight.data.numel()))
        values = values.view(m.weight.data.size())
        m.weight.data.copy_(values)
    elif isinstance(m, nn.BatchNorm2d):
        m.weight.data.fill_(1)
        m.bias.data.zero_()

Can anyone explain what is happening here after isinstance matches nn.Conv2d, nn.Linear, or nn.BatchNorm2d?

It looks like weight-initialization code: the weights are drawn from a truncated normal distribution (scipy's truncnorm). Below is your code with comments:

# Get the standard deviation. Allows per-layer customization;
# defaults to 0.1 if none is specified on the module.
stddev = m.stddev if hasattr(m, 'stddev') else 0.1

# Build a handle for the distribution we want to sample from: a normal
# distribution truncated at -2 and +2 standard deviations.
X = stats.truncnorm(-2, 2, scale=stddev)

# Draw as many random samples as the weight tensor has elements.
values = torch.Tensor(X.rvs(m.weight.data.numel()))

# Reshape the flat sample vector to the shape of the weight tensor.
values = values.view(m.weight.data.size())

# Copy the samples into the weight tensor in place.
m.weight.data.copy_(values)

As for nn.BatchNorm2d: its weights (the per-channel scale) are filled with 1 and its biases (the per-channel shift) are zeroed, so each BatchNorm layer starts out as an identity affine transform.
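The steps above can be reproduced without scipy or torch; here is a minimal numpy sketch of the same truncated-normal draw (rejection sampling, using the same bounds convention as truncnorm: the truncation points -2 and 2 are in units of the standard deviation). The layer shape below is hypothetical, just for illustration:

```python
import numpy as np

def trunc_normal(n, stddev=0.1, a=-2.0, b=2.0, rng=None):
    """Sample n values from N(0, stddev^2) truncated to [a*stddev, b*stddev]
    by rejection sampling."""
    rng = np.random.default_rng() if rng is None else rng
    out = np.empty(0)
    while out.size < n:
        draws = rng.normal(0.0, stddev, size=2 * n)
        # Keep only draws inside the truncation interval.
        draws = draws[(draws >= a * stddev) & (draws <= b * stddev)]
        out = np.concatenate([out, draws])
    return out[:n]

# Fill a fake conv weight of shape (out_channels, in_channels, kH, kW).
shape = (8, 4, 3, 3)
values = trunc_normal(np.prod(shape)).reshape(shape)
print(values.shape)                           # (8, 4, 3, 3)
print(float(np.abs(values).max()) <= 2 * 0.1) # True: nothing beyond +/-2 std devs
```

This mirrors the snippet's three steps: draw numel() samples, reshape to the weight's shape, then (in the PyTorch version) copy the result into the weight tensor.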

How would I initialize the weights for each layer like this diagram, using random values?
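Since the diagram isn't reproduced here I can only sketch the general pattern: give each layer its own shape and standard deviation, then draw a truncated-normal weight tensor per layer. The layer names, shapes, and stddev values below are hypothetical placeholders, so substitute the ones from your diagram (in PyTorch itself you could instead set `m.stddev` on each module before running the loop above, or call `torch.nn.init.trunc_normal_` directly):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-layer spec: name -> (weight shape, stddev).
layers = {
    "conv1": ((8, 3, 3, 3), 0.1),
    "fc1":   ((16, 72),     0.05),
    "fc2":   ((10, 16),     0.01),
}

weights = {}
for name, (shape, stddev) in layers.items():
    # Draw a normal weight tensor, then resample any entries that fall
    # outside +/-2 std devs (truncation by rejection, like truncnorm).
    w = rng.normal(0.0, stddev, size=shape)
    mask = np.abs(w) > 2 * stddev
    while mask.any():
        w[mask] = rng.normal(0.0, stddev, size=int(mask.sum()))
        mask = np.abs(w) > 2 * stddev
    weights[name] = w
    print(name, w.shape)
```

Each layer ends up with its own independently drawn weight tensor, scaled by its own stddev.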