## First Approach

```python
def train1(epoch):
    model.train()
    for idx, data in enumerate(train_dataloader):
        ....

for i in range(1, 5):
    train1(i)
```
## Second Approach

```python
def train2():
    for idx, data in enumerate(train_dataloader):
        ....

model.train()
for i in range(1, 5):
    train2()
```
Can calling `model.train()` at the start of every epoch actually reinitialize the model weights? With the first approach, I see no difference between the model weights at epoch 1 and at epoch n. Could this be the cause, or is the issue something else?
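For context, `model.train()` only sets the module's `training` flag (which changes the behavior of layers like `Dropout` and `BatchNorm`); it does not reinitialize or otherwise modify any parameters. A minimal sketch to verify this, using a stand-in `nn.Linear` model (the actual model and dataloader from the question are not shown here):

```python
import torch
import torch.nn as nn

# Stand-in model; the real model from the question is not shown.
model = nn.Linear(4, 2)

# Snapshot the weights before toggling modes.
before = model.weight.detach().clone()

model.train()  # switch to training mode
model.eval()   # switch to eval mode
model.train()  # and back again

after = model.weight.detach().clone()

# The parameters are untouched by mode switches.
print(torch.equal(before, after))  # True
print(model.training)              # True
```

If the weights are identical across epochs, the more likely culprits are a missing `optimizer.step()` / `loss.backward()` in the elided loop body, a learning rate of zero, or comparing references to the same tensor instead of snapshots taken with `.clone()`.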