Run evaluation without pausing the main training process

Usually, we train the model and evaluate it in the same procedure:

...
for epoch_idx in range(max_epoch):
    net.train()
    # do training with batchsize = N
    for train_img, label in train_loader:
        ...
        pred = net(train_img)
        loss = criterion(pred, label)
        optim.zero_grad()
        loss.backward()
        optim.step()
    # eval
    if epoch_idx % eval_step == 0:
        net.eval()
        # do evaluation with batchsize = 1

However, quite a lot of tasks favor cropped images of a fixed size during training, yet at evaluation time each image has a different size, so the batch size has to be 1.
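To illustrate why the batch size is forced to 1: the default batching step stacks the samples into one array, which fails as soon as two images have different shapes. A minimal sketch with NumPy arrays standing in for images (the shapes are made up for the example):

```python
import numpy as np

# Two "images" of different spatial sizes, as in free-size evaluation.
img_a = np.zeros((3, 224, 224))
img_b = np.zeros((3, 180, 300))

try:
    # What a default collate step would do with a batch of two.
    batch = np.stack([img_a, img_b])
except ValueError:
    # Stacking differently-sized images fails, hence batchsize = 1.
    batch = None

print(batch is None)  # True
```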

In my case, evaluation takes about 10x longer than training, which makes the whole procedure slow. The model is large, and I do not want to store a copy of it for every epoch.

I wonder whether it is practical to create a new thread (or process) that performs the evaluation, so that the main thread continues training without being suspended.
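The pattern I have in mind looks roughly like this (a stdlib-only sketch; the dict-based "model", `train_step`, and `evaluate` are dummy stand-ins for the real network and loops): snapshot the weights, hand the snapshot to a background worker, and keep training on the live model.

```python
import copy
import queue
import threading

# Hypothetical stand-ins: the "model" is just a dict of parameters here.
def train_step(model):
    model["w"] += 1  # pretend this is one epoch of training

def evaluate(snapshot, results, epoch):
    # Runs in the background; touches only its own copy of the weights.
    results.put((epoch, snapshot["w"]))

model = {"w": 0}
results = queue.Queue()
workers = []

max_epoch, eval_step = 6, 2
for epoch_idx in range(max_epoch):
    train_step(model)
    if epoch_idx % eval_step == 0:
        # Snapshot the weights so training can keep mutating `model`
        # while the worker evaluates the frozen copy.
        snapshot = copy.deepcopy(model)
        t = threading.Thread(target=evaluate,
                             args=(snapshot, results, epoch_idx))
        t.start()
        workers.append(t)

for t in workers:
    t.join()

evals = sorted(results.queue)
print(evals)  # each eval sees the weights as of its own epoch
```

One caveat I am aware of: a Python thread shares the GIL with the training loop, so for a compute-heavy evaluation a separate process (e.g. via the multiprocessing module) may be needed to get real overlap; the snapshot-then-hand-off structure stays the same either way.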

I am not familiar with multiprocessing, so any help will be appreciated.