BatchNorm and ReLU

@tom, @ptrblck, Thank you for your time.
As per the discussion, what I understand is that the square root of zeros in the backward pass of std is causing the problem. I am wondering, then, why Conv1d → BatchNorm1d → ReLU for the TDNN is not causing any problem.

If the problem is in the computation of std, would adding eps (1e-6) during the computation of torch.std be a correct and feasible way to handle this? I mean something like this:

$$\mathrm{std}(x) = \sqrt{\frac{1}{N-1}\sum_{i=1}^{N}\left(x_i - \bar{x}\right)^2 + \epsilon}$$

I might be wrong, as I am not too good at maths, especially when it comes to implementation. :slightly_smiling_face: :slightly_smiling_face:

You would have to do this manually, incurring a performance penalty. If you make it a torch.jit.script function, the JIT will fuse it, though.
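A minimal sketch of what this could look like, assuming an unbiased std over the batch dimension with eps added inside the square root (the function name `std_with_eps` and the eps value are illustrative, not an existing PyTorch API):

```python
import torch

@torch.jit.script
def std_with_eps(x: torch.Tensor, eps: float = 1e-6) -> torch.Tensor:
    # Compute the unbiased variance over the batch dimension manually,
    # then add eps before the square root so the backward pass of sqrt
    # never sees an exact zero.
    mean = x.mean(dim=0, keepdim=True)
    var = ((x - mean) ** 2).sum(dim=0) / (x.size(0) - 1)
    return torch.sqrt(var + eps)

x = torch.randn(32, 128, requires_grad=True)
out = std_with_eps(x)
out.sum().backward()  # gradients stay finite even for constant features
```

Scripting the function lets the JIT fuse the elementwise operations, which should recover most of the performance lost by not calling torch.std directly.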