I’m new to PyTorch and have been experimenting with a number of things to get my bearings with LSTMs. I want to create a pipeline that goes LSTM → linear → LSTM → linear, but I’m getting stuck on the transition from the linear layer back into the second LSTM: the shapes of my hidden states don’t seem to match the input. Here’s what I have to initialize the layers:
self.lstm = nn.LSTM(input_size=input_size, hidden_size=hidden_size, num_layers=num_layers, batch_first=True, dropout=dropout)
self.fully_connected = nn.Linear(hidden_size, output_size)
nn.init.xavier_uniform_(self.fully_connected.weight)
self.lstm2 = nn.LSTM(input_size=1, hidden_size=hidden_size, num_layers=num_layers, batch_first=True, dropout=dropout)
self.fully_connected2 = nn.Linear(hidden_size, output_size)
Then, in my forward method, I have this:
hidden_initial = torch.zeros(self.num_layers, x.size(0), self.hidden_size)
cell_initial = torch.zeros(self.num_layers, x.size(0), self.hidden_size)
out, states = self.lstm(x, (hidden_initial, cell_initial))
h, c = states  # each is (num_layers, batch, hidden_size)
out = out[:, -1, :]  # keep only the last time step: (batch, hidden_size)
out = self.fully_connected(out)  # (batch, output_size)
# Issue happens here
out, _ = self.lstm2(out, (h, c))
The error tells me that, for a 2D input, it won’t take a (3D, 3D) tuple of state tensors. However, I’m not sure what can be done to reshape this into the correct format. I’ve tried various slicing, viewing, and reshaping but haven’t managed to get it working yet.
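In case it helps, here’s a stripped-down, runnable version of the model assembled from the snippets above that reproduces the error. The class name TwoStageLSTM and the concrete sizes (input_size=4, hidden_size=8, output_size=1, num_layers=2, dropout=0.1, and the random input shape) are just placeholders I picked for the repro:

import torch
import torch.nn as nn

class TwoStageLSTM(nn.Module):
    def __init__(self, input_size, hidden_size, output_size, num_layers, dropout):
        super().__init__()
        self.num_layers = num_layers
        self.hidden_size = hidden_size
        self.lstm = nn.LSTM(input_size=input_size, hidden_size=hidden_size,
                            num_layers=num_layers, batch_first=True, dropout=dropout)
        self.fully_connected = nn.Linear(hidden_size, output_size)
        nn.init.xavier_uniform_(self.fully_connected.weight)
        self.lstm2 = nn.LSTM(input_size=1, hidden_size=hidden_size,
                             num_layers=num_layers, batch_first=True, dropout=dropout)
        self.fully_connected2 = nn.Linear(hidden_size, output_size)

    def forward(self, x):
        hidden_initial = torch.zeros(self.num_layers, x.size(0), self.hidden_size)
        cell_initial = torch.zeros(self.num_layers, x.size(0), self.hidden_size)
        out, states = self.lstm(x, (hidden_initial, cell_initial))
        h, c = states                     # each (num_layers, batch, hidden_size)
        out = out[:, -1, :]               # (batch, hidden_size)
        out = self.fully_connected(out)   # (batch, output_size) -- now 2D
        out, _ = self.lstm2(out, (h, c))  # <- fails here: 2D input, 3D states
        return self.fully_connected2(out)

model = TwoStageLSTM(input_size=4, hidden_size=8, output_size=1, num_layers=2, dropout=0.1)
model(torch.randn(5, 10, 4))  # batch=5, seq_len=10, features=4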
I’d love it if anyone could offer advice.