hi,
code
train_loader = data.DataLoader(
train_loader,
batch_size=cfg["training"]["batch_size"],
num_workers=cfg["training"]["num_workers"],
shuffle=True,
)
while i <= cfg["training"]["train_iters"] and flag:
for idx,(images, labels) in enumerate(tbar):
i += 1
print('epoch {}--------------------------'.format(i))
print('iter {}---------------------------'.format(idx))
start_ts = time.time()
scheduler.step()
model.train()
images = images.to(device)
labels = labels.to(device)
optimizer.zero_grad()
outputs = model(images)
result
but I expect is printed epoch =1 , Why is the outer loop and the inner loop the same? Is it relevant to set collate_fn? thanks !!!
epoch 1--------------------------
iter 0---------------------------
Train loss: 1.4994: 0%|▌ | 1/225 [00:09<35:33, 9.53s/it]
epoch 1--------------------------
iter 1---------------------------
Train loss: 1.5003: 1%|█▏ | 2/225 [00:10<25:24, 6.84s/it]
epoch 1--------------------------
iter 2---------------------------
Train loss: 1.5007: 1%|█▋ | 3/225 [00:10<18:19, 4.95s/it]
epoch 1--------------------------
iter 3---------------------------
Train loss: 1.5000: 2%|██▎ | 4/225 [00:11<13:22, 3.63s/it]
epoch 1--------------------------
iter 4---------------------------
Train loss: 1.4983: 2%|██▊ | 5/225 [00:11<09:56, 2.71s/it]
epoch 1--------------------------
iter 5---------------------------
Train loss: 1.4978: 3%|███▍ | 6/225 [00:12<07:31, 2.06s/it]
epoch 1--------------------------
iter 6---------------------------
Train loss: 1.4979: 3%|███▉ | 7/225 [00:12<05:51, 1.61s/it]
epoch 1--------------------------
iter 7---------------------------
Train loss: 1.4962: 4%|████▌ | 8/225 [00:13<04:40, 1.29s/it]
epoch 1--------------------------
iter 8---------------------------
Train loss: 1.4960: 4%|█████ | 9/225 [00:13<03:52, 1.07s/it]
epoch 1--------------------------
iter 9---------------------------