I have a problem with how to fit my data into an LSTM model.
The shape of the training features is `torch.Size([1899, 14, 30491])` and the shape of the training labels is `torch.Size([1899, 30490])`. Here 14 is the window size, 30491 is the 30490 products in a shop plus 1 extra 0/1 feature, and 1899 is the length of the time series in days.
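For concreteness, the layout can be reproduced with dummy tensors (random values, purely illustrative stand-ins for my real data):

```python
import torch

# Hypothetical stand-ins for the real data: 1899 days, a window of 14,
# and 30490 products plus 1 extra 0/1 feature per time step.
X_train = torch.randn(1899, 14, 30491)
y_train = torch.randn(1899, 30490)

print(X_train.shape)  # torch.Size([1899, 14, 30491])
print(y_train.shape)  # torch.Size([1899, 30490])
```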
I know that in Keras this can be written simply as
```python
regressor = Sequential()
# Adding the first LSTM layer and some Dropout regularisation
layer_1_units = 40
regressor.add(LSTM(units=layer_1_units, return_sequences=True,
                   input_shape=(X_train.shape[1], X_train.shape[2])))
```
Here `X_train.shape[1]` is 14 and `X_train.shape[2]` is 30491.
My model in PyTorch is:
```python
class M5_predictor(nn.Module):
    def __init__(self, in_features, out_features, n_hidden, n_layers, dropout):
        super(M5_predictor, self).__init__()
        self.n_layers = n_layers
        self.n_hidden = n_hidden
        self.linear = nn.Linear(n_hidden, out_features)
        #self.drop = dropout
        #self.sigmoid = nn.Sigmoid()
        self.rnn1 = nn.LSTM(input_size=in_features,
                            hidden_size=n_hidden,
                            #num_layers=n_layers,
                            )

    def forward(self, in_data):
        #print(in_data.view(in_data.shape[0], in_data.shape[1], in_data.shape[2]).shape)
        batch_size, seq_len, in_features = in_data.size()
        rnn_out, hidden = self.rnn1(in_data)
        output = self.linear(rnn_out)
        return output
```
```python
n_hidden = 40
n_layers = 1
dropout = 0.2
lr = 0.001
in_features = 30491
out_features = 1

# build model
criterion = nn.MSELoss()
model = M5_predictor(in_features, out_features, n_hidden, n_layers, dropout).to(device)
optimizer = Adam(model.parameters(), lr=lr)
```
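To illustrate the mismatch, here is a scaled-down copy of the model above (with small, made-up sizes instead of 30491/40): the forward pass currently returns a 3-D tensor rather than something shaped like (batch_size, 30490).

```python
import torch
import torch.nn as nn

class SmallPredictor(nn.Module):
    """Same structure as my model above, with tiny hypothetical sizes."""
    def __init__(self, in_features, out_features, n_hidden):
        super().__init__()
        self.linear = nn.Linear(n_hidden, out_features)
        # Note: no batch_first=True, so nn.LSTM assumes (seq_len, batch, features)
        self.rnn1 = nn.LSTM(input_size=in_features, hidden_size=n_hidden)

    def forward(self, in_data):
        rnn_out, hidden = self.rnn1(in_data)   # hidden state for every time step
        return self.linear(rnn_out)            # linear applied to every step

model = SmallPredictor(in_features=7, out_features=1, n_hidden=4)
out = model(torch.randn(3, 14, 7))  # (batch, window, features), like my data
print(out.shape)  # torch.Size([3, 14, 1]) -- a 3-D output, not (batch, 30490)
```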
How can I modify my PyTorch code so that the data fits the model and it gives outputs of shape (batch_size, 30490)?
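For reference, here is the direction I have been considering, though I am not sure it is correct: set `batch_first=True` on the LSTM so the input layout matches my (batch, window, features) data, set `out_features` to the number of products, and feed only the last time step into the linear layer. A minimal sketch with small hypothetical sizes:

```python
import torch
import torch.nn as nn

class M5Sketch(nn.Module):
    """Sketch only: batch_first LSTM, last time step fed to a linear layer."""
    def __init__(self, in_features, out_features, n_hidden):
        super().__init__()
        # batch_first=True makes the expected input (batch, seq_len, features)
        self.rnn = nn.LSTM(input_size=in_features, hidden_size=n_hidden,
                           batch_first=True)
        self.linear = nn.Linear(n_hidden, out_features)

    def forward(self, x):
        rnn_out, _ = self.rnn(x)        # (batch, seq_len, n_hidden)
        last_step = rnn_out[:, -1, :]   # (batch, n_hidden): last window position
        return self.linear(last_step)   # (batch, out_features)

# Tiny stand-in sizes; in my case in_features=30491 and out_features=30490
model = M5Sketch(in_features=7, out_features=6, n_hidden=4)
out = model(torch.randn(3, 14, 7))
print(out.shape)  # torch.Size([3, 6])
```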