Recurring loss pattern at steps between epochs

I’m training an LSTM model, but when I plot the loss and accuracy they show a recurring pattern, and I don’t know how to fix it.

The way I save the values is:

train_losses.append((train_running_loss * x.shape[0]) / len(data))
train_acc.append(train_running_acc / ((i + 1) * batch_size * window_size))

I reset the running loss/accuracy to 0 at the start of each epoch.
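For reference, the loop this sits in looks roughly like the sketch below; the model, the toy data, and the hyperparameters are placeholder assumptions, not my real setup:

import torch
import torch.nn as nn

# Sketch of the training loop described above; all sizes, the toy data,
# and the model itself are placeholder assumptions.
batch_size, window_size, n_features, n_classes = 32, 10, 8, 4
data = torch.randn(320, window_size, n_features)           # toy dataset
targets = torch.randint(0, n_classes, (320, window_size))  # per-step labels

model = nn.LSTM(n_features, n_classes, batch_first=True)
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

train_losses, train_acc = [], []
for epoch in range(2):
    train_running_loss = 0.0  # running totals reset at the start of each epoch
    train_running_acc = 0.0
    for i in range(len(data) // batch_size):
        x = data[i * batch_size:(i + 1) * batch_size]
        y = targets[i * batch_size:(i + 1) * batch_size]
        out, _ = model(x)                         # (batch, window, classes)
        loss = criterion(out.transpose(1, 2), y)  # CrossEntropyLoss wants (N, C, ...)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
        train_running_loss += loss.item()
        train_running_acc += (out.argmax(-1) == y).sum().item()
        # appended once per batch, as in the snippet above
        train_losses.append((train_running_loss * x.shape[0]) / len(data))
        train_acc.append(train_running_acc / ((i + 1) * batch_size * window_size))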
But this is what I got:

(plots of training loss and accuracy showing the recurring pattern)

The charts appear to show the LSTM accruing loss between time steps in the series and making more accurate predictions after seeing more of the time series (hence the rising accuracy). And it looks like it starts again from the first time step after about 6,000 steps.
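One quick sanity check: see whether that ~6,000-step period matches one full pass over your data. A minimal sketch, with placeholder numbers you’d swap for your real dataset size and batch size:

import math

# Placeholder values; substitute your actual dataset size and batch size.
dataset_size = 192_000
batch_size = 32

steps_per_epoch = math.ceil(dataset_size / batch_size)
print(steps_per_epoch)  # if this is near 6,000, the pattern restarts once per epoch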