Sample partition technique in PyTorch RNN Tutorials

Hi guys,

I am studying the PyTorch tutorials and find them extremely helpful.

But I am confused by the training sections of https://pytorch.org/tutorials/intermediate/char_rnn_classification_tutorial.html#training and https://pytorch.org/tutorials/intermediate/char_rnn_generation_tutorial.html#training, where it seems each iteration draws a training sample at random from the whole dataset, instead of using a dedicated/separate training set.
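If I read the tutorial code right, the sampling works roughly like this. This is only a minimal sketch with toy stand-in data (not the tutorial's actual name files); `randomTrainingExample` is the pattern I am asking about:

```python
import random

# Toy stand-ins for the tutorial's category_lines dict -- hypothetical data,
# only to illustrate the sampling pattern I am asking about.
category_lines = {
    "English": ["Smith", "Jones", "Brown"],
    "Italian": ["Rossi", "Bianchi", "Ferrari"],
}
all_categories = list(category_lines.keys())

def randomChoice(l):
    return l[random.randint(0, len(l) - 1)]

def randomTrainingExample():
    # Each training iteration draws from the *whole* dataset;
    # nothing appears to be held out.
    category = randomChoice(all_categories)
    line = randomChoice(category_lines[category])
    return category, line

for _ in range(5):
    print(randomTrainingExample())
```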

In the evaluation part, the test examples also seem to be randomly sampled from the whole dataset. Does that mean there is never a clear train/test partition, so all of the data is seen during training? That goes against my usual understanding of train/test splits, or am I misreading the code?
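For contrast, this is the kind of partition I would normally expect, again as a minimal sketch with toy data and hypothetical helper names (not code from the tutorial):

```python
import random

# Toy data again, only to sketch the train/test partition I had in mind.
all_pairs = [
    ("English", "Smith"), ("English", "Jones"), ("English", "Brown"),
    ("Italian", "Rossi"), ("Italian", "Bianchi"), ("Italian", "Ferrari"),
]
random.shuffle(all_pairs)

# Hold out 20% of the data before training ever starts.
split = int(0.8 * len(all_pairs))
train_pairs, test_pairs = all_pairs[:split], all_pairs[split:]

def randomTrainingExample():
    # Training only ever draws from the training portion...
    return random.choice(train_pairs)

def randomTestExample():
    # ...and evaluation only draws from held-out examples the model never saw.
    return random.choice(test_pairs)
```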

Thanks so much for your help.