ReLU, dropout and batch normalization

Hi, I use ReLU and dropout during the training process.
After training, when I use the trained network for validation, should ReLU and dropout still be in the network structure?

Can anyone give me some suggestions?

Thank you so much.

Dropout is disabled during testing, while the ReLU function behaves the same in both modes.
You can use model.train() and model.eval() to switch between training and evaluation behavior.
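
Here is a minimal sketch of what that looks like; the network and input are just placeholders to show how the mode switch affects dropout and batch norm:

```python
import torch
import torch.nn as nn

# Hypothetical small network containing ReLU, dropout, and batch norm
model = nn.Sequential(
    nn.Linear(10, 20),
    nn.BatchNorm1d(20),
    nn.ReLU(),
    nn.Dropout(p=0.5),
    nn.Linear(20, 2),
)

x = torch.randn(4, 10)  # dummy input batch

model.train()        # dropout active, batch norm uses batch statistics
out_train = model(x)

model.eval()         # dropout disabled, batch norm uses running statistics
out_eval = model(x)  # ReLU behaves identically in both modes
```

Note that the layers stay in the network in both cases; model.eval() only changes their runtime behavior.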

Thank you for your reply.

What about the validation process? Should I use model.eval(), or just remove the ReLU and dropout layers?

Yes. Validation works the same way as testing: call model.eval() and keep all layers in the network.
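
For example, a typical validation loop would look something like this (val_loader here is a placeholder for your validation DataLoader):

```python
model.eval()                  # disable dropout, freeze batch norm statistics
with torch.no_grad():         # gradients are not needed for validation
    for inputs, targets in val_loader:
        outputs = model(inputs)
        # ... compute validation loss / metrics here
```

Remember to call model.train() again before resuming training.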