Backpropagation gets slower and slower

I am timing my backprop loop across epochs, and it gets slower every epoch for the same-sized minibatch. How can I debug this? I have already done a quick pass over the code to check for tensors that are created outside the training loop. I assume this is happening because the graph is not being rebuilt from scratch on each backprop pass. What is the rigorous way to debug this?
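
To make the suspected pattern concrete, here is a minimal hypothetical sketch (not my actual code, and the names `state`, `w`, and `count_nodes` are made up for illustration) of what I mean by a tensor set outside the loop keeping its autograd history, so the graph grows every step. The helper walks `grad_fn.next_functions` to count reachable graph nodes:

```python
import torch

def count_nodes(fn, seen=None):
    """Count autograd graph nodes reachable from a grad_fn."""
    seen = set() if seen is None else seen
    if fn is None or fn in seen:
        return 0
    seen.add(fn)
    return 1 + sum(count_nodes(nxt, seen) for nxt, _ in fn.next_functions)

w = torch.randn(10, 10, requires_grad=True)
state = torch.zeros(10)                     # defined outside the loop

for step in range(5):
    state = state + (w @ torch.randn(10))   # history chains onto the old state
    loss = state.sum()
    print(f"step {step}: graph has {count_nodes(loss.grad_fn)} nodes")
    # state = state.detach()                # uncommenting this keeps the graph flat
```

If the node count printed each step keeps growing like this, does that confirm the graph is being carried over between iterations, or is there a more standard diagnostic?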
