How to declare and train an RNN/LSTM in skorch?

The skorch library is amazing. It's very easy to use, but I'm having trouble implementing an LSTM model with it. Since skorch abstracts away the training loop, I don't know what the forward method of the nn.Module should return. My task is plain regression, like predicting house prices. The problem is that most examples I found online are about word embeddings and similar NLP setups, which I don't need; my task is simpler than that. Here is my LSTM declaration:

import torch
from torch import nn


class LstmNetwork(nn.Module):

    def __init__(self, n_features, n_lstm_hidden=16, n_fc_hidden=64, n_out=2, num_layers=1, bi=False, dropout=0.0):
        super().__init__()

        self.n_features = n_features
        self.n_lstm_hidden = n_lstm_hidden
        self.n_fc_hidden = n_fc_hidden
        self.n_out = n_out
        self.num_layers = num_layers
        self.bi = bi
        self.dropout = dropout
        self.hidden_cell = self.init_hidden()

        self.lstm = nn.LSTM(
            input_size=self.n_features,
            hidden_size=self.n_lstm_hidden,
            num_layers=self.num_layers,
            bidirectional=self.bi,
            dropout=self.dropout)

        self.fc = nn.Linear(
            in_features=n_fc_hidden,
            out_features=n_out
        )

    def init_hidden(self):
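        # These tensors are created on the CPU and never move to the GPU
        # with the model; is that why I get a device error on GPU?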

        return (torch.zeros(1, 1, self.n_lstm_hidden),
                torch.zeros(1, 1, self.n_lstm_hidden))

    def forward(self, x):
        # I found this example on the internet, but there is no
        # initialisation of the hidden states here. I tried initialising
        # them with init_hidden() as above, but that throws an error when
        # I run on GPU. I also thought that in PyTorch the forward
        # function should return the hidden states as well, but if I do
        # that here I get an error. I'm not sure my declaration is
        # correct; can someone help?
        out, hn = self.lstm(x)
        out = self.fc(out)
        return out
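
From reading the skorch docs, here is a minimal sketch of what I think the module and the training wiring should look like. I'm assuming batch-first input of shape (batch, seq_len, n_features) and one target value per sequence; the class name LstmRegressor and all the numbers are placeholders I made up. Is this the right approach?

import numpy as np
import torch
from torch import nn
from skorch import NeuralNetRegressor


class LstmRegressor(nn.Module):
    def __init__(self, n_features, n_lstm_hidden=16, n_out=1,
                 num_layers=1, bi=False, dropout=0.0):
        super().__init__()
        # batch_first=True: skorch passes batches shaped
        # (batch, seq_len, n_features)
        self.lstm = nn.LSTM(
            input_size=n_features,
            hidden_size=n_lstm_hidden,
            num_layers=num_layers,
            bidirectional=bi,
            dropout=dropout,
            batch_first=True,
        )
        n_directions = 2 if bi else 1
        # the head must take the LSTM's actual output size as input
        self.fc = nn.Linear(n_lstm_hidden * n_directions, n_out)

    def forward(self, x):
        # passing no hidden state lets nn.LSTM create zero-initialised
        # states on the same device as x, so no manual init_hidden()
        out, _ = self.lstm(x)
        # one prediction per sequence: keep only the last timestep
        return self.fc(out[:, -1, :])


net = NeuralNetRegressor(
    LstmRegressor,
    module__n_features=8,
    criterion=nn.MSELoss,
    max_epochs=20,
    lr=1e-3,
    device='cuda' if torch.cuda.is_available() else 'cpu',
)

# dummy data: 100 sequences of 10 timesteps with 8 features each;
# skorch wants float32 arrays, and y must be 2-D for NeuralNetRegressor
X = np.random.randn(100, 10, 8).astype(np.float32)
y = np.random.randn(100, 1).astype(np.float32)
net.fit(X, y)

In particular, I'd like to know whether forward is ever supposed to return the hidden states when used with skorch, or only the predictions.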

Please help me out with this; I've been struggling with it for two weeks now.