Custom weight initialization

You can still do that. Here I created an instance of the Net class:

>>> net = Net()
>>> net
Net(
  (conv1): Sequential(
    (0): Conv1d(4, 64, kernel_size=(3,), stride=(1,), padding=(1,))
    (1): ReLU()
    (2): Conv1d(64, 128, kernel_size=(3,), stride=(1,), padding=(1,))
    (3): ReLU()
    (4): MaxPool1d(kernel_size=2, stride=2, padding=0, dilation=1, ceil_mode=False)
  )
  (dense): Sequential(
    (0): Linear(in_features=640, out_features=320, bias=True)
    (1): ReLU()
    (2): Dropout(p=0.5)
    (3): Linear(in_features=320, out_features=10, bias=True)
  )
)
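
For context, since the class body is not shown in this thread: a definition that would print exactly this might look like the sketch below. The in_features=640 of the first linear layer implies flattening after the pooling step, i.e. 128 channels x 5 steps, so inputs of shape (N, 4, 10). This is a reconstruction, not the original code.

import torch.nn as nn

class Net(nn.Module):
    """Sketch reconstructed from the printed architecture above."""
    def __init__(self):
        super().__init__()
        self.conv1 = nn.Sequential(
            nn.Conv1d(4, 64, kernel_size=3, stride=1, padding=1),
            nn.ReLU(),
            nn.Conv1d(64, 128, kernel_size=3, stride=1, padding=1),
            nn.ReLU(),
            nn.MaxPool1d(kernel_size=2),
        )
        self.dense = nn.Sequential(
            nn.Linear(640, 320),
            nn.ReLU(),
            nn.Dropout(p=0.5),
            nn.Linear(320, 10),
        )

    def forward(self, x):
        x = self.conv1(x)   # (N, 4, 10) -> (N, 128, 5)
        x = x.flatten(1)    # -> (N, 640)
        return self.dense(x)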

We can then access the layers of this object by attribute name and index:

>>> net.conv1[0]
Conv1d(4, 64, kernel_size=(3,), stride=(1,), padding=(1,))
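
The same dotted names show up in named_parameters(), which is handy when you want to pick layers programmatically instead of hard-coding indices (standard PyTorch; the shapes below follow from the architecture printed above):

>>> for name, param in net.named_parameters():
...     print(name, tuple(param.shape))
conv1.0.weight (64, 4, 3)
conv1.0.bias (64,)
conv1.2.weight (128, 64, 3)
conv1.2.bias (128,)
dense.0.weight (320, 640)
dense.0.bias (320,)
dense.3.weight (10, 320)
dense.3.bias (10,)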

In the same way, we can assign new values to a layer's weights. On recent PyTorch versions, prefer doing the in-place update under torch.no_grad() rather than going through .weight.data, so autograd does not record the change:

# K is any tensor that broadcasts to the weight's shape, here (64, 4, 3)
with torch.no_grad():
    net.conv1[0].weight += K
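
If the goal is a full custom initialization rather than a one-off tweak, the usual pattern (standard PyTorch, not specific to this thread) is Module.apply together with the initializers in torch.nn.init:

import torch.nn as nn

def init_weights(m):
    # Called once for every submodule of net, including nested ones.
    if isinstance(m, (nn.Conv1d, nn.Linear)):
        nn.init.kaiming_normal_(m.weight, nonlinearity='relu')
        if m.bias is not None:
            nn.init.zeros_(m.bias)

net.apply(init_weights)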