Does calling model.train() inside a function reinitialize the model weights each time?

## First approach

```python
def train1(epoch):
    model.train()
    for idx, data in enumerate(train_dataloader):
        ....

for i in range(1, 5):
    train1(i)
```
## Second approach

```python
def train2():
    for idx, data in enumerate(train_dataloader):
        ....

model.train()
for i in range(1, 5):
    train2()
```

Can calling model.train() in every epoch actually reinitialize the model weights? I am seeing no difference between the model weights at epoch 1 and epoch n with the first approach. Could this be the cause, or is the issue something else?

model.train() just puts the model into training mode; it has nothing to do with initializing or modifying weights. It only changes the behavior of certain layers: Dropout is active, and BatchNorm uses per-batch statistics and updates its running averages. It is not strictly required for training a model that has none of those layers. model.eval() is the opposite: it disables dropout and makes BatchNorm use its stored running statistics, which also avoids the error BatchNorm raises in training mode when you pass a single image with a batch_size of 1 (it cannot compute batch statistics from one sample). If your weights are not changing across epochs, the cause is elsewhere, e.g. a missing optimizer.step() or loss.backward().
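To see this concretely, here is a minimal sketch (with a made-up toy model, not the one from the question) showing that model.train() leaves the parameters untouched and only flips the module's training flag:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
# Toy model with a Dropout layer, whose behavior depends on train/eval mode
model = nn.Sequential(nn.Linear(4, 4), nn.Dropout(p=0.5))

# Snapshot the weights, switch modes, and snapshot again
before = [p.clone() for p in model.parameters()]
model.train()  # only sets model.training = True; weights are untouched
after = [p.clone() for p in model.parameters()]

print(all(torch.equal(b, a) for b, a in zip(before, after)))  # True
print(model.training)  # True

model.eval()           # dropout off, batchnorm uses running stats
print(model.training)  # False
```

So calling model.train() once per epoch (first approach) is harmless; the weights only change when an optimizer applies gradients.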