When should I stop training?

I have a basic question about my network. I track four metrics during the training process: train loss, train accuracy, validation loss, and validation accuracy. I have heard about overtraining, which happens when you set too many training epochs. When should I stop the training process? Maybe overtraining is impossible if I have validation data?
Does anyone have interesting docs about it?
Thanks for the help.

Plot your val_accuracy and see after how many epochs it starts dropping; that is where you should cut off training. Alternatively, watch train_loss: it will keep shrinking even after your model starts overfitting, so a train loss that falls while validation accuracy drops is a sign of overfitting.
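That rule can be sketched as a small framework-agnostic helper, assuming you collect validation accuracy per epoch into a list (the name `should_stop` and the patience value are illustrative, not from any particular library):

```python
def should_stop(val_accs, patience=3):
    """Stop once validation accuracy hasn't improved for `patience` epochs."""
    if len(val_accs) <= patience:
        return False  # not enough history yet to judge
    best_epoch = val_accs.index(max(val_accs))
    # epochs elapsed since the best validation accuracy was observed
    return len(val_accs) - 1 - best_epoch >= patience

# Accuracy peaks at epoch 3, then drops for 3 straight epochs -> stop.
print(should_stop([0.60, 0.70, 0.71, 0.70, 0.69, 0.68]))  # True
```

If you are on Keras, the built-in `EarlyStopping` callback does the same thing (set `monitor="val_accuracy"`, a `patience`, and `restore_best_weights=True` to roll back to the best epoch automatically).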

Thanks for your answer, but in my case I use a pretrained ResNet-50, and my validation accuracy reaches ~1.0 after 3-4 epochs (my dataset has 6 classes with 1500 pictures each). So should I continue the training process until the validation accuracy starts to drop?

You may start by setting epochs to 10 and watching val_acc, then decrease the epoch count from there…
If your val_acc is higher at, let's say, epoch 5 than at epoch 10, then by epoch 10 the model is overfitting… see after how many epochs it starts to fall…
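One way to read the cutoff off a finished run is to find the epoch with the best validation accuracy. The history below is a made-up example, not real training output:

```python
# Hypothetical per-epoch validation accuracies from a 10-epoch run.
val_acc = [0.55, 0.72, 0.85, 0.97, 0.99, 0.99, 0.98, 0.97, 0.96, 0.95]

# Epoch (1-indexed) where val_acc peaks; everything after it is the
# region where the model is overfitting.
best_epoch = max(range(len(val_acc)), key=lambda e: val_acc[e]) + 1
print(best_epoch)  # 5 -- retrain to (or restore weights from) this epoch
```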