Loss Fails to Update (Sometimes)

I’ve finally tracked down the problem, thanks to a similar issue with non-updating gradients described here: Loss does not change and weights remain zero. I was passing my final output through a ReLU before passing it to log_softmax. Since ReLU clamps all negative logits to zero, it can flatten the final outputs and keep the loss from moving. Figured I’d post this update in case anyone comes across a similar issue in the future.
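For anyone hitting the same thing, here is a minimal sketch of the fix, assuming a simple classifier head (the layer names and sizes are hypothetical, not from my actual model). The point is just that the last linear layer's output should go straight into log_softmax, with no ReLU in between:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class Classifier(nn.Module):
    def __init__(self, in_features=128, num_classes=10):
        super().__init__()
        self.fc1 = nn.Linear(in_features, 64)
        self.fc2 = nn.Linear(64, num_classes)

    def forward(self, x):
        x = F.relu(self.fc1(x))      # ReLU on hidden layers is fine
        logits = self.fc2(x)
        # Buggy version: logits = F.relu(self.fc2(x))
        # Clamping the logits to be non-negative before log_softmax
        # squashes the negative outputs to zero and can stall training.
        return F.log_softmax(logits, dim=1)
```

If you go this route, pair the log_softmax output with nn.NLLLoss, or alternatively drop the log_softmax and feed the raw logits to nn.CrossEntropyLoss, which applies it internally.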