torch.nn.Dropout(p=0.5, inplace=False)

During training, dropout scales the surviving activations by 1/(1-p) so that the expected value of each activation is the same at training and evaluation time (at evaluation time the module is a no-op, so no rescaling is needed).
Have a look at this post for more information.
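To make the scaling concrete, here is a minimal pure-Python sketch of inverted dropout, the scheme `torch.nn.Dropout` uses; the function name and signature are illustrative, not PyTorch's API:

```python
import random

def inverted_dropout(x, p=0.5, training=True):
    """Zero each element with probability p; scale survivors by 1/(1-p).

    The 1/(1-p) factor keeps the expected value of each element
    unchanged, so evaluation mode can simply return the input as-is.
    """
    if not training or p == 0.0:
        return list(x)  # evaluation mode (or p=0): identity, no scaling
    scale = 1.0 / (1.0 - p)
    return [0.0 if random.random() < p else v * scale for v in x]

x = [1.0, 1.0, 1.0, 1.0]
print(inverted_dropout(x, p=0.5))          # each element is 0.0 or 2.0
print(inverted_dropout(x, training=False))  # [1.0, 1.0, 1.0, 1.0]
```

With p=0.5 each surviving element is doubled, so averaged over many draws the output matches the input in expectation, which is exactly why no rescaling is applied at evaluation time.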

Where did you notice the 1/(1-p) scaling?