Is it possible that uncommenting the line

out_u, mid_u = self.net(Iun)

affects performance, even though the loss does not depend on it? In my understanding, commenting or uncommenting this line should not change the test accuracy of the network, yet it does. What could be the reason? Here is the training loop with the line uncommented:
self.net.train()
for batch_idx in range(n):
    Il, lbls, Iun = self.getNextBatch()

    # labeled
    outputs, mid = self.net(Il)
    loss = self.criterion(outputs, lbls)
    self.optimizer.zero_grad()
    loss.backward()
    self.optimizer.step()

    # unlabeled
    out_u, mid_u = self.net(Iun)

acc = testModel()
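For reference, here is a small check I could imagine adding to the loop above (the helper name state_changed_by_forward is mine, and I am assuming self.net is an ordinary torch.nn.Module): take a deep copy of the model's state_dict, run the extra forward pass, and list every parameter or buffer whose stored value changed. Anything the module stores outside the optimizer's reach, for example running statistics in normalization layers, would show up here.

import copy
import torch

def state_changed_by_forward(net, x):
    # Assumes the caller has already put `net` in the mode being tested
    # (train() or eval()), exactly as in the loop above.
    before = copy.deepcopy(net.state_dict())   # snapshot of all parameters and buffers
    with torch.no_grad():                      # gradients are irrelevant for this check
        net(x)                                 # the extra forward pass whose output is unused
    after = net.state_dict()
    # Return the keys of every entry whose value changed during the forward pass alone.
    return [k for k in before if not torch.equal(before[k], after[k])]

In my loop this would be called as state_changed_by_forward(self.net, Iun) in place of the unlabeled forward pass; a non-empty result would mean the forward pass by itself mutates model state even though the loss never sees it.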
However, the following two code snippets (CODE 1, where the line is commented out, and CODE 2, where the unlabeled forward pass is run with the network in eval mode) give me the same test classification performance (i.e., acc1 = acc2).
# CODE 1
self.net.train()
for batch_idx in range(n):
    Il, lbls, Iun = self.getNextBatch()

    # labeled
    outputs, mid = self.net(Il)
    loss = self.criterion(outputs, lbls)
    self.optimizer.zero_grad()
    loss.backward()
    self.optimizer.step()

    # unlabeled
    # out_u, mid_u = self.net(Iun)

acc1 = testModel()
# CODE 2
for batch_idx in range(n):
    Il, lbls, Iun = self.getNextBatch()

    # labeled
    self.net.train()
    outputs, mid = self.net(Il)
    loss = self.criterion(outputs, lbls)
    self.optimizer.zero_grad()
    loss.backward()
    self.optimizer.step()
    self.net.eval()

    # unlabeled
    out_u, mid_u = self.net(Iun)

acc2 = testModel()
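To check whether acc1 = acc2 reflects identical final models rather than a coincidence, I could also compare the two trained networks directly, assuming CODE 1 and CODE 2 are run on two identically initialized copies of the network with the same batch order. The helper below (the name models_identical is mine) simply compares the two state_dicts entry by entry.

import torch

def models_identical(net_a, net_b):
    # True iff both models hold exactly the same parameters and buffers.
    sd_a, sd_b = net_a.state_dict(), net_b.state_dict()
    return sd_a.keys() == sd_b.keys() and all(
        torch.equal(sd_a[k], sd_b[k]) for k in sd_a
    )

If this returns True for the models produced by CODE 1 and CODE 2, but False when either is compared against the model from the first loop, then the accuracy gap must come from whatever the extra train-mode forward pass writes into the model.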