By default all modules are initialized to train mode (self.training = True). Also be aware that some layers behave differently during training and evaluation (like BatchNorm and Dropout), so setting the mode correctly matters.
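A minimal sketch of switching modes (the module composition here is just illustrative):

```python
import torch.nn as nn

model = nn.Sequential(nn.Linear(4, 4), nn.Dropout(p=0.5), nn.BatchNorm1d(4))

print(model.training)  # True: modules start in train mode
model.eval()           # switches the model and all submodules to evaluation mode
print(model.training)  # False
model.train()          # switches back to train mode
print(model.training)  # True
```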
Dropout works as a regularizer to prevent overfitting during training.
On each forward call it randomly zeroes elements of the input to the Dropout layer.
It should be disabled during testing (model.eval()) since you usually want to use the full model with no elements masked.
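A quick sketch showing the difference (the tensor shape and p=0.5 are just for illustration):

```python
import torch
import torch.nn as nn

drop = nn.Dropout(p=0.5)
x = torch.ones(1, 8)

drop.train()     # training mode: roughly half the elements are zeroed,
print(drop(x))   # surviving elements are scaled by 1/(1-p) = 2.0

drop.eval()      # eval mode: dropout is a no-op
print(drop(x))   # the full input passes through unchanged
```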