Loss is 1 but gradients are zero

Tracing your code, I think that when layer_and_weights becomes 0, your predicted value and target value end up the same, and the gradient could be 0 for that reason.
What do you think?

Yes, you are right, but if that happened, the loss itself would also become zero, which is not the case here.
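
For anyone hitting this later, here is a minimal sketch of how a loss can sit at 1 while every gradient is exactly zero. The setup is made up (the original code isn't shown in this thread, and `layer_out` / `weights` are hypothetical stand-ins for layer_and_weights), but it shows one mechanism: if the prediction is a product of two tensors and both are zero, the product rule zeroes out both gradients regardless of the loss value.

```python
import torch

# Hypothetical reproduction: the prediction is a product of two factors.
# If both factors are zero, d(a*b)/da = b = 0 and d(a*b)/db = a = 0,
# so both gradients vanish no matter how large the loss is.
x = torch.ones(1)
layer_out = torch.zeros(1, requires_grad=True)  # stands in for layer(x)
weights = torch.zeros(1, requires_grad=True)    # stands in for layer_and_weights

pred = layer_out * weights * x
target = torch.ones(1)

loss = (pred - target).pow(2).mean()  # MSE: (0 - 1)^2 == 1
loss.backward()

print(loss.item())            # 1.0 -- loss is nonzero
print(layer_out.grad.item())  # 0.0 -- zero because weights == 0
print(weights.grad.item())    # 0.0 -- zero because layer_out == 0
```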

Adding regularization made sure layer_and_weights doesn't become zero by default, and the gradient started flowing again. So I guess this solves the problem for now.
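
To illustrate with the same made-up setup as the sketch above: once `weights` sits away from zero (which is what keeping layer_and_weights nonzero effectively ensures), the gradient with respect to the other factor is nonzero again, so optimization can move off the dead point.

```python
import torch

# Same hypothetical setup as above, but with `weights` initialized
# away from zero.
x = torch.ones(1)
target = torch.ones(1)
layer_out = torch.zeros(1, requires_grad=True)
weights = torch.full((1,), 0.1, requires_grad=True)  # nonzero this time

pred = layer_out * weights * x
loss = (pred - target).pow(2).mean()
loss.backward()

print(loss.item())            # 1.0  -- loss is still nonzero
print(layer_out.grad.item())  # -0.2 -- but the gradient flows again
```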

Thanks