Hello! I need to create a recurrent NN. During training I have all of the x_i and y_i,
but for testing I only have x_1, and I want to use y_1 as x_2, y_2 as x_3, and so on. How can I do this with torch.nn.RNN?
Assuming your hidden dimension is the same size as your input dimension, you can do it like so:
import torch

rnn = torch.nn.RNN(16, 16)   # input_size == hidden_size, so outputs can be fed back in
x1 = torch.randn((1, 1, 16)) # shape: (seq_len, batch, input_size)
y1, h1 = rnn(x1)             # hidden state defaults to zeros on the first step
y2, h2 = rnn(y1, h1)         # feed the previous output and hidden state back in
...
The RNN returns two things: the sequence of outputs, one per time step, and the last hidden state. The layer can take in a whole sequence of inputs together with an initial hidden state (which defaults to all zeros on the first forward pass).
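To make the shapes concrete, here is a minimal sketch of the full-sequence calling convention (the dimensions are placeholders, and h0 can be omitted entirely):

import torch

rnn = torch.nn.RNN(16, 16)  # input_size = hidden_size = 16
x = torch.randn(10, 1, 16)  # whole sequence: (seq_len, batch, input_size)
h0 = torch.zeros(1, 1, 16)  # (num_layers, batch, hidden_size); optional, zeros by default
output, hn = rnn(x, h0)
print(output.shape)         # torch.Size([10, 1, 16]) -- one output per time step
print(hn.shape)             # torch.Size([1, 1, 16])  -- the last hidden state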
Can you explain how I should train and use the net you suggested?
It’s hard to say without knowing the objective, but I would maybe do something like this:
import torch

rnn = torch.nn.RNN(16, 16)
x1 = torch.randn((1, 1, 16)) # example input for one-to-many
seq_output_len = 5           # I want a sequence of five steps produced

yn = x1
hn = None                    # None makes the first step start from an all-zero hidden state
outputs = []
for _ in range(seq_output_len):
    yn, hn = rnn(yn, hn)     # use the previous output and hidden state
    outputs.append(yn)

output = torch.cat(outputs)  # shape: (5, 1, 16)
loss = do_something(output)  # placeholder: compute a loss suited to your objective
loss.backward()
This is a simplistic example; the details will vary depending on your use case.
No, you misunderstood: for training I already have all the y_i and want to use them.
For training, feed your entire sequence into the RNN and you will get a sequence of the same length out; compute a loss between that output and your target y sequence. For testing, refer to the code I provided earlier: output is what you compare against the ground truth.
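For instance, a minimal training step might look like this (X, Y, the MSE loss, and the Adam optimizer are placeholder assumptions; substitute your own data and objective):

import torch

rnn = torch.nn.RNN(16, 16)
optimizer = torch.optim.Adam(rnn.parameters(), lr=1e-3)  # assumed optimizer
criterion = torch.nn.MSELoss()                           # assumed loss

# Placeholder data: X holds x_1..x_n, Y holds y_1..y_n,
# both of shape (seq_len, batch, input_size).
X = torch.randn(10, 1, 16)
Y = torch.randn(10, 1, 16)

optimizer.zero_grad()
output, hn = rnn(X)          # one forward pass over the whole training sequence
loss = criterion(output, Y)  # compare every step's output with the known target
loss.backward()
optimizer.step()

At test time, switch to the step-by-step loop above, seeding it with x_1 only.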