I finally found the problem!!
For the last set of convolutions, i.e. 128 -> 64 -> 64 -> 1, the activation function should not be used!
Since ReLU clamps every negative value to zero, applying it after these final layers made the outputs vanish!
I just removed the nn.ReLU() modules after these convolution layers and now everything works like a charm!
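For anyone hitting the same thing, here is a minimal sketch of what the fixed block looks like (the layer names, kernel sizes, and padding are my assumptions for illustration, not the original model):

```python
import torch.nn as nn

# Illustrative sketch of the final 128 -> 64 -> 64 -> 1 block.
# Note: no nn.ReLU() between or after these convolutions, so the
# output is not clamped to zero and negative values can pass through.
final_block = nn.Sequential(
    nn.Conv2d(128, 64, kernel_size=3, padding=1),
    nn.Conv2d(64, 64, kernel_size=3, padding=1),
    nn.Conv2d(64, 1, kernel_size=3, padding=1),
)
```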
Saeed