How to update the weights only after a fixed number of batches, instead of after every batch

The image size varies a lot, so I kept the batch size at 1.
Now I need to backpropagate the loss and take an optimizer step only after every BACK_STEP batches (batch_size = 1).

model.train()
BACK_STEP = 128  # number of batches to accumulate before an optimizer step
running_loss = 0.0
running_corrects = 0

for batch_idx, (inputs, labels) in enumerate(train_loader):
    inputs = inputs.to(device)
    labels = labels.to(device)

    if batch_idx % BACK_STEP == 0:
        # start of a new accumulation window: clear the accumulated gradients
        optimizer.zero_grad()

    with torch.set_grad_enabled(True):
        outputs = model(inputs)
        _, preds = torch.max(outputs, 1)
        loss = criterion(outputs, labels)
        loss.backward()  # gradients accumulate in the parameters' .grad
        if (batch_idx + 1) % BACK_STEP == 0:
            optimizer.step()  # update the weights once every BACK_STEP batches

    # statistics
    running_loss += loss.item() * inputs.size(0)
    running_corrects += torch.sum(preds == labels.data)

scheduler.step()

Is this the correct way to do it?
I think the loss needs to be summed over all the image passes as well and then propagated back.
Any changes you would suggest?

The code looks alright, and you don’t necessarily need to accumulate the loss yourself, since the gradients are automatically accumulated across backward() calls until you zero them.
Here is a detailed description of the different approaches.
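To make the accumulation behavior concrete, here is a minimal, self-contained sketch of the same pattern. A toy linear model and random data stand in for your real model, loader, and criterion (those names are assumptions, not your setup). Note the division by ACCUM_STEPS, which turns the accumulated gradient into an average over the window rather than a sum:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
model = nn.Linear(4, 2)                  # toy stand-in for the real model
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
criterion = nn.CrossEntropyLoss()

ACCUM_STEPS = 4                          # plays the role of BACK_STEP
num_batches = 8
before = model.weight.clone()

model.train()
for batch_idx in range(num_batches):
    inputs = torch.randn(1, 4)           # batch_size = 1, as in the question
    labels = torch.randint(0, 2, (1,))

    outputs = model(inputs)
    # Scale the loss so the accumulated gradient is the average over the
    # ACCUM_STEPS samples rather than their sum.
    loss = criterion(outputs, labels) / ACCUM_STEPS
    loss.backward()                      # gradients accumulate in .grad

    if (batch_idx + 1) % ACCUM_STEPS == 0:
        optimizer.step()                 # one update per ACCUM_STEPS batches
        optimizer.zero_grad()            # clear grads for the next window
```

An equivalent alternative is to sum the loss tensors over the window and call backward() once at the end, but that keeps every intermediate graph alive and uses more memory, so calling backward() per batch as above is usually preferable.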