Is the model in my main function updated when I call loss.backward() inside another function?

Hi,

I create my model within my main function. Then I send the model to a training function and update it there. In that case, should I return the model from the training function in order to use the trained model inside my main function?

Here is a small snippet to describe what I’ve just asked:

def main():
    model = BertForSequenceClassification.from_pretrained('bert-base-cased')
    tokenizer = BertTokenizer.from_pretrained('bert-base-cased')
    data = get_some_data()
    train(model, data)
    # So at this point when I use model to make predictions, is this a trained model,
    # or was the trained model just lost since I did not return it from the train
    # function? (i.e. model = train(model…))

def train(model, data):
    no_decay = ['bias', 'LayerNorm.weight']
    optimizer_grouped_parameters = [
        {'params': [p for n, p in model.named_parameters() if not any(nd in n for nd in no_decay)], 'weight_decay': args['weight_decay']},
        {'params': [p for n, p in model.named_parameters() if any(nd in n for nd in no_decay)], 'weight_decay': 0.0}
    ]

    optimizer = AdamW(optimizer_grouped_parameters, lr=some_lr)
    scheduler = WarmupLinearSchedule(optimizer, warmup_steps=args['warmup_steps'], t_total=t_total)
    model.train()
    for x, y in data:
        outputs = model(x, labels=y)  # passing labels makes the model return the loss first
        loss = outputs[0]
        loss.backward()
        optimizer.step()       # this is what actually updates the parameters, in place
        scheduler.step()
        optimizer.zero_grad()  # clear accumulated gradients before the next batch

If I'm not wrong, from a Python perspective you don't need to return it, but I would consider returning it good practice.
If you call train(model, data), what do you lose by writing model = train(model, data) instead? It's cleaner, more explicit, and more readable. (If model were defined at module level, the function could even see it as an outer variable without it being passed, but passing it explicitly is clearer.) Also note that this behavior depends on what kind of object you pass to the function: a model is mutable, so the function updates it in place, whereas an immutable object like an integer would behave the other way around — rebinding it inside the function does not affect the caller's variable.
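To see why the model does not need to be returned, here is a minimal, framework-free sketch of the two cases: mutating a mutable object in place (which is what optimizer.step() does to the model's parameters) versus rebinding an immutable one (the integer case). The function names and the list-of-floats "parameters" are just illustrative stand-ins, not real training code.

```python
def train_in_place(params):
    # Mutates the caller's object: both names refer to the same list,
    # analogous to optimizer.step() updating the model's weights.
    for i in range(len(params)):
        params[i] -= 1.0

def increment(n):
    # Rebinds the local name only; the caller's int is untouched.
    n = n + 1
    return n

model_params = [1.0, 2.0, 3.0]
train_in_place(model_params)
print(model_params)  # [0.0, 1.0, 2.0] — updated without being returned

step = 0
increment(step)
print(step)          # still 0 — to "update" an int you must use the return value
step = increment(step)
print(step)          # 1
```

The same logic applies to an nn.Module: the object you created in main and the one modified inside train are one and the same, so the trained weights are visible in main without any return.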