Understanding how the optimizer works

I am writing a custom optimizer. I took the SGD code and modified it, but I get very different results when I uncomment the commented line below. Can someone help me understand what is going on?

# quantize the gradient to {-1, 0, +1} using the threshold self.fix_update
d_p[torch.abs(d_p) < self.fix_update] = 0.0
d_p[d_p <= -self.fix_update] = -1.0
d_p[d_p >= self.fix_update] = 1.0
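For reference, here is a minimal standalone sketch of what those three lines do to a gradient tensor (the threshold value 0.5 is a made-up stand-in for self.fix_update):

```python
import torch

fix_update = 0.5  # hypothetical threshold standing in for self.fix_update

d_p = torch.tensor([-2.0, -0.4, 0.1, 0.6, 1.5])

# quantize the gradient to {-1, 0, +1} around the threshold
d_p[torch.abs(d_p) < fix_update] = 0.0
d_p[d_p <= -fix_update] = -1.0
d_p[d_p >= fix_update] = 1.0

print(d_p)  # tensor([-1.,  0.,  0.,  1.,  1.])
```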

# self.current_batch[p.myid].zero_()
self.current_batch[p.myid] = d_p
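One thing that may be relevant here: in PyTorch, assigning a tensor stores a reference, not a copy, so an in-place op like zero_() on the stored tensor also mutates d_p (and vice versa). A minimal sketch of the aliasing (the dictionary key 0 is a stand-in for p.myid):

```python
import torch

current_batch = {}

d_p = torch.tensor([1.0, -1.0, 0.0])

# plain assignment stores a reference to the same tensor object
current_batch[0] = d_p
current_batch[0].zero_()   # in-place op on the alias...
print(d_p)                 # ...also zeroes d_p: tensor([0., 0., 0.])

# clone() stores an independent copy instead
d_p = torch.tensor([1.0, -1.0, 0.0])
current_batch[0] = d_p.clone()
current_batch[0].zero_()
print(d_p)                 # d_p unchanged: tensor([ 1., -1.,  0.])
```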