Nested GRU layers

Hello,
I was using the model below with 1 GRU layer and it was working perfectly, but once I increase the number of GRU layers it starts giving errors like
RuntimeError: Expected hidden size (2, 10, 100), got (1, 10, 100)

Please advise :slight_smile:

import torch
import torch.nn as nn
import torch.nn.functional as F

class GRU(nn.Module):
    def __init__(self, input_size, hidden_size, output_size):
        super(GRU, self).__init__()
        num_layers = 2
        self.hidden_size = hidden_size
        self.gru = nn.GRU(input_size, hidden_size, num_layers, dropout=0.1)
        self.linear = nn.Linear(hidden_size, output_size)

    def forward(self, input, hidden):
        hx, hn = self.gru(input, hidden)
        rearranged = hn.view(hn.size(1), hn.size(2))
        out1 = self.linear(rearranged)
        out2 = F.softmax(out1)
        return out2

That is because the hidden state should have its first dimension set to the number of layers. The 'hidden' state here is the tensor you have appropriately labelled 'hidden', which is passed in to the forward method.
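
For example, a minimal sketch of a correctly shaped initial hidden state, assuming the sizes from your error message (num_layers=2, batch=10, hidden_size=100); adjust to your real values:

import torch

num_layers, batch_size, hidden_size = 2, 10, 100

# nn.GRU expects the hidden state shaped (num_layers * num_directions, batch, hidden_size)
hidden = torch.zeros(num_layers, batch_size, hidden_size)

# output, hn = model.gru(input, hidden)   # hn comes back as (2, 10, 100) as well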

Thanks. When I set the first dimension of the hidden state to the number of layers, it gives me this warning at the beginning:

Using a target size (torch.Size([10, 2])) that is different to the input size (torch.Size([2, 10, 2])) is deprecated. Please ensure they have the same size.
  "Please ensure they have the same size.".format(target.size(), input.size()))

and then this error:
ValueError: Target and input must have the same number of elements. target nelement (20) != input nelement (40)

after I changed the rearranged variable in forward to

rearranged = hn.view(hn.size(0), hn.size(1), hn.size(2))
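
To double-check the shapes, I reproduced them in isolation (assuming batch 10, hidden_size 100 and output_size 2, as in the messages above):

import torch
import torch.nn as nn

hn = torch.zeros(2, 10, 100)                               # hidden state from both GRU layers
rearranged = hn.view(hn.size(0), hn.size(1), hn.size(2))   # still (2, 10, 100)
out = nn.Linear(100, 2)(rearranged)                        # (2, 10, 2) -> 40 elements
target = torch.zeros(10, 2)                                # (10, 2)    -> 20 elements
print(out.shape, target.shape)                             # torch.Size([2, 10, 2]) torch.Size([10, 2])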

What might be the solution? Thanks
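
One way to make the shapes line up, assuming the target is really meant to be (batch, output_size) as in torch.Size([10, 2]): feed only the top layer's hidden state to the linear layer. A sketch of the forward method under that assumption:

    def forward(self, input, hidden):
        output, hn = self.gru(input, hidden)
        last_layer = hn[-1]               # (batch, hidden_size): hidden state of the top GRU layer
        out1 = self.linear(last_layer)    # (batch, output_size), matching a (10, 2) target
        out2 = F.softmax(out1, dim=1)
        return out2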