Hi,
this should be a quick one, but I wasn’t able to figure it out myself.
When I use a pre-defined module in PyTorch, I can typically access its weights fairly easily.
However, how do I access them if I wrapped the module in nn.Sequential() first?
Please see the toy example below.
import torch.nn as nn

class My_Model_1(nn.Module):
    def __init__(self, D_in, D_out):
        super(My_Model_1, self).__init__()
        self.layer = nn.Linear(D_in, D_out)

    def forward(self, x):
        out = self.layer(x)
        return out

class My_Model_2(nn.Module):
    def __init__(self, D_in, D_out):
        super(My_Model_2, self).__init__()
        self.layer = nn.Sequential(nn.Linear(D_in, D_out))

    def forward(self, x):
        out = self.layer(x)
        return out
model_1 = My_Model_1(10, 10)
print(model_1.layer.weight)

model_2 = My_Model_2(10, 10)
# How do I print the weights now?
# model_2.layer.0.weight doesn't work.
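Is integer indexing into the Sequential the intended way to reach the wrapped layer? Here is a minimal sketch of what I would expect to work, assuming nn.Sequential supports bracket indexing and that named_parameters() lists nested modules with dotted names:

# Sketch (assumption): nn.Sequential exposes its children by integer index,
# so the inner Linear would be reachable as model_2.layer[0].
print(model_2.layer[0].weight)

# Sketch (assumption): named_parameters() yields (name, parameter) pairs for
# all nested submodules, e.g. "layer.0.weight" and "layer.0.bias".
for name, param in model_2.named_parameters():
    print(name, param.shape)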
Many thanks, any help is much appreciated.