Combining Trained Models in PyTorch

Hi ptrblck,
I am new to PyTorch (and deep learning in general), and your example of combining trained models is exactly the situation I am facing now. I would appreciate some help checking my reasoning:
My setup is: I have model A and model B, and I wrap them in `MyEnsemble(modelA, modelB)` as you described, i.e. I defined `model = MyEnsemble(modelA, modelB)`.
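For reference, my understanding of the ensemble wrapper from your example is roughly the following (a minimal sketch; the layer sizes are placeholders, not my actual dimensions, and I added the `log_softmax` myself because my loop uses `F.nll_loss`):

```python
import torch
import torch.nn as nn

class MyEnsemble(nn.Module):
    def __init__(self, modelA, modelB):
        super().__init__()
        self.modelA = modelA
        self.modelB = modelB
        # placeholder sizes: in_features must match the two concatenated feature dims
        self.classifier = nn.Linear(2 * 10, 2)

    def forward(self, x):
        x1 = self.modelA(x)
        x2 = self.modelB(x)
        x = torch.cat((x1, x2), dim=1)
        # return log-probabilities, since the training loop uses F.nll_loss
        return torch.log_softmax(self.classifier(x), dim=1)
```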
Then, for the optimizer and training loop I used:

```python
import torch as th
import torch.nn.functional as F
from torch.utils.data import DataLoader  # use torch_geometric's DataLoader instead if the items are graph Data objects
from tqdm import tqdm

# model = MyEnsemble(modelA, modelB), lr, weight_decay, EarlyStopping, etc. are defined above
optimizer = th.optim.Adam(model.parameters(), lr=lr, weight_decay=weight_decay)

# build the loaders once, outside the epoch loop
train_loader = DataLoader(traindata_list, batch_size=batchsize,
                          shuffle=True, num_workers=5)
test_loader = DataLoader(testdata_list, batch_size=batchsize,
                         shuffle=False, num_workers=5)  # no need to shuffle the test set

model.train()
train_losses, val_losses, train_accs, val_accs = [], [], [], []
early_stopping = EarlyStopping(patience=patience, verbose=True)

for epoch in range(n_epochs):
    avg_loss, avg_acc = [], []
    for Batch_data in tqdm(train_loader):
        Batch_data = Batch_data.to(device)  # .to() is not in-place; assign the result
        out_labels = model(Batch_data)      # log-probabilities, to match F.nll_loss
        loss = F.nll_loss(out_labels, Batch_data.y)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
        avg_loss.append(loss.item())
        _, pred = out_labels.max(dim=-1)
```
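To convince myself that the single optimizer actually sees the parameters of both submodels, I assume I can run a quick sanity check like this (not part of training, just a one-off comparison):

```python
# The ensemble's trainable-parameter count should cover both submodels
# plus the classifier head, so one Adam instance updates everything.
n_ensemble = sum(p.numel() for p in model.parameters() if p.requires_grad)
n_submodels = sum(p.numel()
                  for m in (modelA, modelB)
                  for p in m.parameters() if p.requires_grad)
print(n_ensemble, n_submodels)  # n_ensemble should be >= n_submodels
```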
I am wondering whether this is the correct way to optimize the parameters of both models at once.
I would very much appreciate your help.
Thanks!