How to save weights from training and then test

    for i, (x, y) in enumerate(zip(feature_trainloader, label_trainloader), 0):
        optimizer.zero_grad()                         # clear gradients from the previous batch
        output = self.forward(x)                      # forward pass
        target = y
        loss = self.CrossEntropyLoss(output, target)  # compute the loss for this batch
        print(loss)
        loss.backward()                               # backpropagate
        optimizer.step()                              # update the weights
    print('Finished Training')

I now want to calculate accuracy by passing feature_testloader through the network and comparing the predictions against label_testloader. How can I do that? When the for loop ends and I call forward again, won't the weights be reset?

The weights are not reset. The trained parameters stay on the model object in memory, so you can simply run another for loop over your test loaders.
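
For example, here is a minimal evaluation sketch. It assumes the loops live inside a model method (hence `self`), that `forward` returns raw class scores, and that `label_testloader` yields class-index targets (the same assumptions CrossEntropyLoss already makes in your training loop):

    import torch

    correct = 0
    total = 0
    self.eval()                        # switch to evaluation mode (affects dropout/batchnorm)
    with torch.no_grad():              # no gradients needed during evaluation
        for x, y in zip(feature_testloader, label_testloader):
            output = self.forward(x)
            predicted = torch.argmax(output, dim=1)   # class with the highest score
            total += y.size(0)
            correct += (predicted == y).sum().item()

    print(f'Test accuracy: {100 * correct / total:.2f}%')

If you also want the weights to survive beyond the current process (which is usually what "saving" means), write the model's state_dict to disk with torch.save(self.state_dict(), 'model_weights.pt') and restore it later with self.load_state_dict(torch.load('model_weights.pt')); the file name here is just a placeholder.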