Seems like there is a mistake in the PyTorch tutorials

I found that the example program in the RL tutorial has a mistake: there is no activation function such as ReLU in the fully connected layers. So after a number of iterations, the episode duration gets worse…


After the last fully connected layer, you generally do not put a ReLU, because the network's outputs (here, Q-values) need to be able to take negative values.
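
To illustrate, here is a minimal sketch of a DQN-style network in the spirit of the tutorial (the layer sizes and names are illustrative, not the tutorial's exact code): ReLU goes between the hidden linear layers, while the final linear layer emits raw Q-values with no activation.

```python
import torch
import torch.nn as nn

class QNetwork(nn.Module):
    """Hypothetical DQN-style head: hidden layers use ReLU,
    the output layer does not (Q-values may be negative)."""

    def __init__(self, n_obs: int = 4, n_actions: int = 2):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_obs, 128),
            nn.ReLU(),                  # activation after a hidden layer
            nn.Linear(128, 128),
            nn.ReLU(),                  # activation after a hidden layer
            nn.Linear(128, n_actions),  # output layer: no ReLU here
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)

q = QNetwork()
out = q(torch.zeros(1, 4))
print(tuple(out.shape))  # one Q-value per action
```

If a ReLU were appended after the last layer, every Q-value would be clamped to be non-negative, which distorts the value estimates the agent learns from.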