Self.init_weights() with Dynamic STD

Hi,

I want to run my NN with different standard deviations for the weight initialization to see which value gives the best performance. I have a loop that passes different STD values to my network, but when I try to pass the value I get this error: TypeError: init_weights() missing 1 required positional argument: 'STD'

I simplified my code here:

A. Instantiate the network and pass the STD value:
Mynet = Net_simple(STD=0.01)

B. My network:
class Net_simple(torch.nn.Module):
    def __init__(self, STD):
        super(Net_simple, self).__init__()

        self.nn = torch.nn.Sequential(torch.nn.Linear(100, 1), torch.nn.LeakyReLU())
        self.float()
        self.apply(self.init_weights(STD))  # <- this line raises the TypeError

    def forward(self, x):
        o1 = self.nn(x)
        return o1

    def init_weights(self, m, STD):
        if isinstance(m, torch.nn.Linear):
            print('initiating weight..' + m.__class__.__name__)
            torch.nn.init.normal(m.weight, STD)
            m.bias.data.fill_(0.01)

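For context, Module.apply expects a callable that takes a single module argument, so an extra value like STD cannot be passed through it positionally. Below is a minimal sketch of one possible workaround, assuming STD is bound in advance with functools.partial; it also switches the deprecated torch.nn.init.normal to normal_ and passes STD explicitly as the std keyword (the sweep values at the bottom are just for illustration):

import torch
from functools import partial

class Net_simple(torch.nn.Module):
    def __init__(self, STD):
        super(Net_simple, self).__init__()
        self.nn = torch.nn.Sequential(torch.nn.Linear(100, 1), torch.nn.LeakyReLU())
        self.float()
        # apply() passes each submodule as the only argument, so STD is bound beforehand
        self.apply(partial(self.init_weights, STD=STD))

    def forward(self, x):
        return self.nn(x)

    def init_weights(self, m, STD):
        if isinstance(m, torch.nn.Linear):
            print('initiating weight..' + m.__class__.__name__)
            # normal_ takes (tensor, mean, std), so STD goes in as the std keyword
            torch.nn.init.normal_(m.weight, mean=0.0, std=STD)
            m.bias.data.fill_(0.01)

# sweep over several std values (illustrative values only)
for std in [0.01, 0.05, 0.1]:
    Mynet = Net_simple(STD=std)

A lambda, e.g. self.apply(lambda m: self.init_weights(m, STD)), would work the same way; the key point is that apply only ever supplies the module itself.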