I use a Dropout layer in my LSTM.
If I fix the number in torch.manual_seed(number),
the test accuracy doesn't change, but varying torch.manual_seed(number)
gives different accuracies.
Can I choose the best model by tuning torch.manual_seed(number)?
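For context, here is a minimal sketch of the seed-fixing behavior described above (the layer sizes are arbitrary): with the same seed, weight initialization and dropout masks are reproduced exactly, which is why the accuracy doesn't change.

```python
import torch
import torch.nn as nn

def set_seed(seed: int) -> None:
    # Fixing the seed makes weight init (and dropout masks) reproducible
    torch.manual_seed(seed)

set_seed(0)
a = nn.LSTM(input_size=8, hidden_size=16)
w1 = a.weight_ih_l0.clone()

set_seed(0)
b = nn.LSTM(input_size=8, hidden_size=16)
w2 = b.weight_ih_l0.clone()

# Same seed -> identical initial weights
print(torch.equal(w1, w2))
```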
It has been shown that a good initialization can improve the performance of a neural network,
but I don't think simply changing the seed is a good way to achieve that.
This may help: "All you need is a good init" (Mishkin & Matas) https://arxiv.org/abs/1511.06422
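As a sketch of principled initialization instead of seed search, you could apply an explicit init scheme to the LSTM's parameters. The choices below (Xavier for input-to-hidden weights, orthogonal for hidden-to-hidden, zero biases) are a common heuristic, not the LSUV procedure from the linked paper:

```python
import torch
import torch.nn as nn

def init_lstm(lstm: nn.LSTM) -> None:
    # Common heuristic (not LSUV): Xavier for input-to-hidden,
    # orthogonal for hidden-to-hidden, zeros for biases.
    for name, param in lstm.named_parameters():
        if "weight_ih" in name:
            nn.init.xavier_uniform_(param)
        elif "weight_hh" in name:
            nn.init.orthogonal_(param)
        elif "bias" in name:
            nn.init.zeros_(param)

lstm = nn.LSTM(input_size=8, hidden_size=16, num_layers=2, dropout=0.5)
init_lstm(lstm)
# weight_hh_l0 has shape (4 * hidden_size, hidden_size)
print(lstm.weight_hh_l0.shape)
```

This way the init is explicit and documented, rather than an accident of whichever seed happened to score best on the test set.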