Dear All,
I want to initialize FC layers in a customized way.
For example,
self.last_layer = nn.Linear(100, 10)
For the weights of the first 50 units, I want them to be all zeros, and the last 50 to be random.
Any idea how to implement this?
Thank you.
You could create a custom nn.Parameter filled with your desired values and assign it directly to the .weight attribute of the layer inside a with torch.no_grad() context, or you could .copy_ the desired tensor into the .weight inside the same no_grad guard.
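A minimal sketch of the in-place approach under `torch.no_grad()`. Note that `nn.Linear(100, 10)` stores its weight as a `(10, 100)` tensor (`out_features, in_features`), so I'm assuming "first 50 units" means the weights connected to the first 50 input features, i.e. the first 50 columns:

```python
import torch
import torch.nn as nn

last_layer = nn.Linear(100, 10)  # weight has shape (10, 100)

with torch.no_grad():
    # Zero the weights for the first 50 input features
    last_layer.weight[:, :50].zero_()
    # Randomly (re)initialize the weights for the last 50 input features
    nn.init.normal_(last_layer.weight[:, 50:])
```

If you instead meant the first 50 output units of a larger layer, you would index the rows (`last_layer.weight[:50]`) rather than the columns. Either way, the parameter stays a leaf tensor with `requires_grad=True`, so training continues to work as usual (though the zeroed entries will receive gradients and move away from zero unless you mask them each step).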