I am trying to understand the structure of a network but got confused. I want to know what determines the structure of a network: the `__init__` function or `forward()`?

In the tutorial, I saw a network can be defined as

```
class DynamicNet(torch.nn.Module):
    def __init__(self, D_in, H, D_out):
        super(DynamicNet, self).__init__()
        self.input_linear = torch.nn.Linear(D_in, H)
        self.middle_linear = torch.nn.Linear(H, H)
        self.output_linear = torch.nn.Linear(H, D_out)

    def forward(self, x):
        h_relu = self.input_linear(x).clamp(min=0)
        for _ in range(2):
            h_relu = self.middle_linear(h_relu).clamp(min=0)
        y_pred = self.output_linear(h_relu)
        return y_pred
```

In DynamicNet, since middle_linear is applied twice in the loop, I guess there are three hidden layers in total, but do the two middle applications share the same weights?
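To make the question concrete, here is a minimal sketch of what I mean by "same weights" (TinyNet is just a placeholder class, not from the tutorial). Since middle_linear is a single module, I would expect only one weight/bias pair to be registered, no matter how many times forward() calls it:

```python
import torch

class TinyNet(torch.nn.Module):
    def __init__(self):
        super(TinyNet, self).__init__()
        self.middle_linear = torch.nn.Linear(4, 4)

    def forward(self, x):
        # the same module (and therefore the same weights) is applied twice
        for _ in range(2):
            x = self.middle_linear(x).clamp(min=0)
        return x

net = TinyNet()
# only one weight and one bias are registered, however often forward() reuses the layer
print(sorted(name for name, _ in net.named_parameters()))
```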

And what if I define a network like this:

```
class Net(torch.nn.Module):
    def __init__(self, D_in, H, D_out):
        super(Net, self).__init__()
        self.input_linear = torch.nn.Linear(D_in, H)
        self.middle_linear = torch.nn.Linear(H, H)
        self.extra_linear = torch.nn.Linear(H, H)
        self.output_linear = torch.nn.Linear(H, D_out)

    def forward(self, x):
        h_relu = self.input_linear(x).clamp(min=0)
        h_relu = self.middle_linear(h_relu).clamp(min=0)
        y_pred = self.output_linear(h_relu)
        return y_pred
```

I have a module extra_linear in `__init__`, but it is not used in forward(). Will it have any effect on the network? Will that layer's parameters be updated during backpropagation?
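Here is a small experiment I sketched to probe this (the names Net/used/unused are placeholders, not from the tutorial). My guess is that the unused layer still shows up in the parameter list, but never receives a gradient:

```python
import torch

class Net(torch.nn.Module):
    def __init__(self):
        super(Net, self).__init__()
        self.used = torch.nn.Linear(4, 4)
        self.unused = torch.nn.Linear(4, 4)  # registered in __init__, never called in forward()

    def forward(self, x):
        return self.used(x)

net = Net()
net(torch.randn(2, 4)).sum().backward()

# the unused layer is still registered as a parameter...
print('unused.weight' in dict(net.named_parameters()))
# ...but backward() never reached it, so its gradient is still None
print(net.unused.weight.grad is None)
```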

In summary, I feel the structure of the network is defined in forward(), but when I print the network, what shows up is what was defined in `__init__`. What is the relationship between `__init__` and forward()?
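This is what I mean about printing (again with placeholder names): even if forward() uses only one of the layers, printing the module seems to list everything registered in `__init__`:

```python
import torch

class Net(torch.nn.Module):
    def __init__(self):
        super(Net, self).__init__()
        self.a = torch.nn.Linear(4, 4)
        self.b = torch.nn.Linear(4, 4)

    def forward(self, x):
        # only self.a is used, yet print() lists both a and b,
        # because repr() walks the submodules registered in __init__
        return self.a(x)

print(Net())
```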

I hope I expressed myself clearly…

Thank you!