RNN module weights are not part of single contiguous chunk of memory

Thanks for your suggestion, it worked perfectly. By the way, could you explain why? Thanks.

Thanks guys,
I added self.rnn.flatten_parameters() to forward() before calling the RNN, and it worked.

By the way, putting the same call in __init__() does not work.
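
To make the fix concrete, here is a minimal sketch of the pattern described above; the Encoder class, layer sizes, and dummy input are my own illustrative assumptions, not code from the thread. For context: cuDNN expects the RNN weights in one contiguous block of GPU memory, and flatten_parameters() re-compacts them. A one-time call in __init__() is unlikely to help, both because flatten_parameters() only takes effect once the weights live on the GPU under cuDNN (the module is usually moved there after construction) and because wrappers like nn.DataParallel replicate the module on every forward pass, so the flattening has to be redone per call.

```python
import torch
import torch.nn as nn

class Encoder(nn.Module):
    # Hypothetical module for illustration; the thread only shows
    # the flatten_parameters() call itself.
    def __init__(self, input_size=32, hidden_size=64):
        super().__init__()
        self.rnn = nn.LSTM(input_size, hidden_size, batch_first=True)

    def forward(self, x):
        # Re-compact the cuDNN weight buffers into one contiguous
        # chunk before each call; replication (e.g. nn.DataParallel)
        # happens per forward pass, so flattening once in __init__
        # would not survive it.
        self.rnn.flatten_parameters()
        out, _ = self.rnn(x)
        return out

device = "cuda" if torch.cuda.is_available() else "cpu"
model = Encoder().to(device)
x = torch.randn(8, 10, 32, device=device)  # (batch, seq_len, features)
print(model(x).shape)  # torch.Size([8, 10, 64])
```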
