For example, if I have a layer:
nn.Sequential(
    nn.Linear(1, 1),
    nn.ReLU(),
    nn.Linear(1, 1),
    nn.ReLU(),
    nn.Linear(1, 1),
)
My question is: how can I get the output from the second nn.Linear(1, 1) in this Sequential?
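One way to sketch this (assuming PyTorch): an nn.Sequential can be indexed with a slice, which returns a new Sequential containing only those modules, so running the first three modules stops right after the second nn.Linear:

```python
import torch
import torch.nn as nn

seq = nn.Sequential(
    nn.Linear(1, 1),
    nn.ReLU(),
    nn.Linear(1, 1),
    nn.ReLU(),
    nn.Linear(1, 1),
)

x = torch.rand(1, 1)
# seq[:3] is a Sequential of Linear -> ReLU -> Linear,
# so its output is exactly the second Linear's output.
hidden = seq[:3](x)
print(hidden.shape)  # torch.Size([1, 1])
```

This keeps the graph intact, so you can still backprop through `hidden` if needed.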
Oh, I got it. I can use .children().
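A minimal sketch of that idea (assuming PyTorch): .children() yields the immediate submodules in registration order, so you can feed the input through them one at a time and stop once the layer you care about has run:

```python
import torch
import torch.nn as nn

seq = nn.Sequential(
    nn.Linear(1, 1),
    nn.ReLU(),
    nn.Linear(1, 1),
    nn.ReLU(),
    nn.Linear(1, 1),
)

x = torch.rand(1, 1)
# Walk the submodules in order; index 2 is the second nn.Linear.
for i, layer in enumerate(seq.children()):
    x = layer(x)
    if i == 2:  # stop right after the second Linear
        break
print(x)
```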
@mshmoon, how do you do that more specifically? I’m dealing with the same problem, so your advice’d be appreciated. Thanks a lot in advance!
What kind of issue are you facing?
You could do something like below:
import torch
import torch.nn as nn

seq = nn.Sequential(nn.Linear(5, 5), nn.Dropout(0.2), nn.Linear(5, 1))
x = torch.rand(5, 5)
for layer in seq:
    x = layer(x)
    print(x)
Hi @InnovArul,
The problem I was concerned with was getting output of the hidden layers of AlexNet. It turned out that @fmassa has already provided a simple solution:
Thanks, your question & solution page helped me find a solution.
The code below works for me:
> fnn.model
Sequential(
(0): Linear(in_features=4, out_features=3, bias=True)
(1): ReLU()
(2): Linear(in_features=3, out_features=3, bias=True)
(3): LayerNorm()
(4): ReLU()
(5): Softmax()
)
> fnn.model[:-1](data_torch_variable)
i.e.
fnn.model[:-1](pre.trans_to_variable(data_x))[0:3]
tensor([[0.0000, 0.5921, 0.4040],
[0.0000, 0.5703, 0.4293],
[0.0000, 0.5618, 0.4390]], grad_fn=<SliceBackward>)
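For models that are not plain Sequentials (e.g. the AlexNet case mentioned above), a forward hook is another way to grab an intermediate output. A sketch, assuming PyTorch; the small model and the `save_output` callback here are made up for illustration:

```python
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(4, 3),
    nn.ReLU(),
    nn.Linear(3, 3),
    nn.ReLU(),
)

captured = {}

def save_output(module, inputs, output):
    # Called every time the hooked module's forward runs.
    captured["hidden"] = output

# Hook the second Linear (index 2 in this Sequential).
handle = model[2].register_forward_hook(save_output)

out = model(torch.rand(2, 4))
print(captured["hidden"].shape)  # torch.Size([2, 3])
handle.remove()  # detach the hook when done
```

The advantage over slicing is that the full forward pass still runs normally; the hook just records the tensor as it flows past.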
How does .children() work?