This is my network:

```
import torch
import torch.nn as nn
import torch.nn.functional as F

class network(nn.Module):
    def __init__(self):
        super(network, self).__init__()
        self.fc1 = nn.Linear(5, 4)
        self.fc2 = nn.Linear(4, 2)

    def forward(self, data):
        data = F.relu(self.fc1(data))  # apply fc1 to the input, then ReLU
        data = self.fc2(data)
        return F.softmax(data, dim=1)  # pass dim explicitly to avoid ambiguity
```

Loop 1 prints all parameters of each layer as expected, whereas loop 2 seems to print only the first row of each child's parameters (since `param[0]` indexes the first row). I would like to access the weights of the first layer only, for pruning. What is the easiest way to access the following tensor:

```
tensor([[-0.1513, -0.2254, -0.2822, -0.2793, 0.1676],
[ 0.3483, -0.3100, 0.3420, 0.4152, 0.1245],
[-0.0814, 0.0852, -0.0957, -0.2115, 0.1776],
[-0.3269, -0.0743, 0.0511, 0.4126, -0.3794]])
```

```
# loop 1: prints every parameter tensor of each child module
for child in Net.children():
    for param in child.parameters():
        print(param)

print("")
print("")

# loop 2: param[0] indexes the first row of each parameter tensor
for child in Net.children():
    for name, param in child.named_parameters():
        print(name, param[0])
```

Output:

```
Parameter containing:
tensor([[-0.1513, -0.2254, -0.2822, -0.2793, 0.1676],
[ 0.3483, -0.3100, 0.3420, 0.4152, 0.1245],
[-0.0814, 0.0852, -0.0957, -0.2115, 0.1776],
[-0.3269, -0.0743, 0.0511, 0.4126, -0.3794]])
Parameter containing:
tensor([ 0.0853, -0.2218, -0.4387, 0.2383])
Parameter containing:
tensor([[-0.1804, 0.2937, -0.4402, -0.2483],
[-0.3421, 0.0601, 0.1138, 0.4677]])
Parameter containing:
tensor([ 0.4512, -0.2957])
weight tensor([-0.1513, -0.2254, -0.2822, -0.2793, 0.1676])
bias tensor(1.00000e-02 *
8.5337)
weight tensor([-0.1804, 0.2937, -0.4402, -0.2483])
bias tensor(0.4512)
```
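
For reference, a minimal sketch of the direct attribute access I am considering instead of the loops (assuming the architecture and instance name `Net` from above):

```python
import torch
import torch.nn as nn

class network(nn.Module):
    def __init__(self):
        super(network, self).__init__()
        self.fc1 = nn.Linear(5, 4)
        self.fc2 = nn.Linear(4, 2)

Net = network()

# Each nn.Linear stores its weight matrix as a Parameter
# of shape (out_features, in_features).
w = Net.fc1.weight
print(w.shape)  # torch.Size([4, 5])

# detach() returns a tensor that shares storage with the weight
# but is not tracked by autograd, which is convenient for inspection.
w_detached = w.detach()
```

I am not sure whether this attribute access is the intended idiom, or whether going through `named_parameters()` is preferred.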