Even if the parameters are the same, the inference results are not necessarily the same — some layers behave differently depending on the training mode.
For dropout, train(True) randomly zeroes activations (and rescales the rest); train(False) does no dropout, so the output is identical to the input.
And for batchnorm, train(True) normalizes with the current batch's mean and variance, while train(False) uses the stored running mean and running variance.
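To make the train/eval difference concrete, here is a minimal NumPy sketch of the two behaviors — it is not PyTorch's actual implementation (the function names and signatures are made up for illustration), just the logic described above:

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout(x, p, training):
    # train(True): zero each element with probability p, rescale survivors by 1/(1-p)
    # train(False): identity — output equals input exactly
    if not training:
        return x
    mask = rng.random(x.shape) >= p
    return x * mask / (1.0 - p)

def batchnorm(x, running_mean, running_var, training, eps=1e-5):
    # train(True): normalize with the current batch's own mean/var
    # train(False): normalize with the stored running mean/var
    if training:
        mean, var = x.mean(axis=0), x.var(axis=0)
    else:
        mean, var = running_mean, running_var
    return (x - mean) / np.sqrt(var + eps)

x = np.array([[1.0, 2.0], [3.0, 4.0]])

# Same input, same parameters, different mode → different output:
print(np.allclose(dropout(x, 0.5, training=False), x))  # eval dropout is the identity
train_out = batchnorm(x, np.zeros(2), np.ones(2), training=True)
eval_out = batchnorm(x, np.zeros(2), np.ones(2), training=False)
```

In train mode, batchnorm's output has (approximately) zero mean per feature regardless of the running statistics; in eval mode, the output depends only on the running statistics, so the two modes generally disagree on the same input.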