What is the difference between the following two methods, and what changes if there are more fully-connected layers? I want an example. Thank you

import torch.nn as nn

class RNN(nn.Module):
    def __init__(self):
        super(RNN, self).__init__()
        self.rnn = nn.RNN(
            input_size=INPUT_SIZE,   # INPUT_SIZE defined elsewhere
            hidden_size=32,
            num_layers=1,
        )
        self.out = nn.Linear(32, 1)

    def forward(self, x, h):
        # x: (seq_len, batch, INPUT_SIZE); out: (seq_len, batch, 32)
        out, h = self.rnn(x, h)
        prediction = self.out(out)
        return prediction, h


import torch
import torch.nn as nn

class RNN(nn.Module):
    def __init__(self):
        super(RNN, self).__init__()
        self.rnn = nn.RNN(
            input_size=1,
            hidden_size=32,
            num_layers=1,
            batch_first=True,
        )
        self.out = nn.Linear(32, 1)

    def forward(self, x, h_state):
        # x: (batch, seq_len, 1); r_out: (batch, seq_len, 32)
        r_out, h_state = self.rnn(x, h_state)
        outs = []
        for time_step in range(r_out.size(1)):
            # apply the linear layer to each time step separately
            outs.append(self.out(r_out[:, time_step, :]))
        return torch.stack(outs, dim=1), h_state

Hi, your code is poorly formatted, and there are several differences, e.g. batch_first=True. What exactly would you like to know?

(As an aside: it’s customary in polite English to ask “I would like an example” instead of the harsher-sounding “I want an example”.)

Sorry, my English is not very good. My question is about the difference between using a loop and not using a loop in the forward computation of the two pieces of code, thank you.

Apparently the outputs of your two forward methods are different.

I want to know what the difference is between these two calculations. Thank you

For the first one, you are only applying the FC layer to the last hidden state. For the second one, you are applying the FC layer to all hidden states.
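For what it’s worth, here is a minimal sketch (with made-up, hypothetical shapes) of those two patterns, i.e. applying the fully-connected layer to only the last time step versus to every time step. Note that nn.Linear acts on the last dimension of its input:

import torch
import torch.nn as nn

rnn = nn.RNN(input_size=1, hidden_size=32, num_layers=1, batch_first=True)
fc = nn.Linear(32, 1)

x = torch.randn(8, 10, 1)          # (batch, seq_len, input_size) -- made-up sizes
out, h = rnn(x)                    # out: (batch, seq_len, 32)

last_only = fc(out[:, -1, :])      # FC on the last time step only -> (batch, 1)
all_steps = fc(out)                # FC on every time step -> (batch, seq_len, 1),
                                   # same result as looping over time steps and stacking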

For the first one, shouldn’t using the last state be prediction = self.out(out[-1, :, :])? I don’t quite understand. For the second one, if I have two fully-connected layers applied to all the hidden states, what should I write? Please give me an example, thank you

You should take a look at the RNN docs; they tell you the shape and meaning of the outputs of nn.RNN. https://pytorch.org/docs/1.2.0/nn.html#recurrent-layers
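For reference, with batch_first=True the documented shapes are output: (batch, seq_len, hidden_size) and h_n: (num_layers, batch, hidden_size). Below is a minimal, hypothetical sketch (one possible way, not the only one) of stacking two fully-connected layers and applying them to all hidden states, as asked above; the intermediate width 16 is an arbitrary choice for illustration:

import torch
import torch.nn as nn

rnn = nn.RNN(input_size=1, hidden_size=32, num_layers=1, batch_first=True)

x = torch.randn(8, 10, 1)          # (batch, seq_len, input_size) -- made-up sizes
r_out, h_n = rnn(x)                # r_out: (batch, seq_len, 32), h_n: (1, 8, 32)

# Two stacked FC layers applied to every hidden state; nn.Linear acts on the
# last dimension, so no explicit loop over time steps is needed.
head = nn.Sequential(
    nn.Linear(32, 16),
    nn.ReLU(),
    nn.Linear(16, 1),
)
prediction = head(r_out)           # (batch, seq_len, 1)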