Gradients blowing up when F.dropout is used with Conv1D

Posted a GitHub issue here: https://github.com/pytorch/pytorch/issues/18169

Gradients are blowing up in PyTorch 1.0.1 when using Conv1d --> PReLU --> dropout --> Conv1d. This didn't happen in PyTorch 0.3.1; I noticed it while refactoring. I'm not sure whether it's a bug or whether I'm doing something wrong.
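For reference, here is a minimal sketch of the pipeline described above. The channel sizes, kernel size, and dropout probability are assumptions for illustration, not taken from the original code:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class Net(nn.Module):
    """Sketch of the reported pipeline: Conv1d -> PReLU -> F.dropout -> Conv1d.
    All layer sizes here are assumed for illustration."""

    def __init__(self):
        super().__init__()
        self.conv1 = nn.Conv1d(16, 32, kernel_size=3, padding=1)
        self.prelu = nn.PReLU()
        self.conv2 = nn.Conv1d(32, 16, kernel_size=3, padding=1)

    def forward(self, x):
        x = self.prelu(self.conv1(x))
        # Functional dropout: `training` must be passed explicitly,
        # unlike nn.Dropout, which follows the module's train/eval mode.
        x = F.dropout(x, p=0.5, training=self.training)
        return self.conv2(x)

net = Net()
x = torch.randn(8, 16, 100)          # (batch, channels, length)
net(x).sum().backward()
# Inspect gradient magnitudes on the first conv layer
print(net.conv1.weight.grad.abs().max())
```

One thing worth double-checking when refactoring from older PyTorch: `F.dropout`'s `training` flag, which does not track `model.train()`/`model.eval()` on its own the way the `nn.Dropout` module does.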