Why is my loss stuck after the 2nd epoch?

I am really not sure why my loss gets stuck after the second epoch.
Likewise, the RMS error and R² score barely change after the second epoch.


Epoch 1 -- loss 17.611322, RMS error 0.149893, R² score -0.002587 
Epoch 2 -- loss 9.828691, RMS error 0.139825, R² score 0.140504 
Epoch 3 -- loss 9.810824, RMS error 0.139688, R² score 0.141909 
Epoch 4 -- loss 9.756177, RMS error 0.139007, R² score 0.150746 
Epoch 5 -- loss 9.712296, RMS error 0.138353, R² score 0.158283

The training loop:

for epoch in range(epochs):
    epoch_error = []
    epoch_loss = []
    epoch_r2 = []
    for i_batch, minibatch in enumerate(dataloader):
        inputs, outputs = minibatch
        optimizer.zero_grad()
        pred = model(inputs)  # call the module itself rather than model.forward
        loss = target_loss(pred, outputs) + beta * model.kl_loss
        loss.backward()
        optimizer.step()
        # metrics for logging only, pulled off the graph
        error = torch.sqrt(torch.mean((pred.flatten() - outputs.flatten()) ** 2)).item()
        r2 = r2_score(outputs.flatten().detach().numpy(),
                      pred.flatten().detach().numpy())
        epoch_error.append(error)
        epoch_loss.append(loss.item())  # .item() instead of loss.data.detach().numpy()
        epoch_r2.append(r2)
    hist_error.append(np.mean(epoch_error))
    hist_loss.append(np.mean(epoch_loss))
    hist_r2.append(np.mean(epoch_r2))
    print("Epoch %d -- loss %f, RMS error %f, R² score %f" % (epoch + 1, hist_loss[-1], hist_error[-1], hist_r2[-1]))
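The loop assumes the model exposes a `kl_loss` attribute that is refreshed on each forward pass, as in a variational/Bayesian layer. A minimal sketch of such a layer (the class name and parameterisation here are my own illustration, not the asker's actual model):

```python
import torch
import torch.nn as nn

class BayesianLinear(nn.Module):
    """Toy variational linear layer exposing a kl_loss attribute,
    mirroring the model.kl_loss term used in the training loop above."""
    def __init__(self, in_features, out_features):
        super().__init__()
        # Gaussian posterior over weights: mean and log-variance
        self.w_mu = nn.Parameter(torch.zeros(out_features, in_features))
        self.w_logvar = nn.Parameter(torch.zeros(out_features, in_features))
        self.kl_loss = torch.tensor(0.0)

    def forward(self, x):
        std = torch.exp(0.5 * self.w_logvar)
        # reparameterisation trick: sample weights differentiably
        w = self.w_mu + std * torch.randn_like(std)
        # KL divergence between N(mu, sigma^2) and a standard normal prior
        self.kl_loss = 0.5 * torch.sum(
            self.w_mu ** 2 + std ** 2 - self.w_logvar - 1.0
        )
        return x @ w.t()
```

With this layer, `target_loss(pred, outputs) + beta * model.kl_loss` is the usual ELBO-style objective, with `beta` weighting the KL term against the data fit.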

Where target_loss is simply torch.mean(torch.sum((pred - out) ** 2))
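Written out as a function, `target_loss` looks like this. One thing worth noting: `torch.sum` with no `dim` argument already reduces to a scalar, so the outer `torch.mean` is a no-op and the loss is a plain sum of squared errors over the whole batch rather than a per-sample average:

```python
import torch

def target_loss(pred, out):
    # torch.sum with no dim reduces to a scalar, so the outer
    # torch.mean leaves it unchanged: this is a sum of squared
    # errors over all elements, not an average.
    return torch.mean(torch.sum((pred - out) ** 2))
```

Because of this, the reported loss scales with batch size, which may explain why it looks large compared to the RMS error.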