Hello guys,
I'm trying to fine-tune a BERT model (bert-base-uncased) for a text classification task. I'm getting a strange error when calling backward() on the loss:
torch.nn.modules.module.ModuleAttributeError: 'BCEWithLogitsLoss' object has no attribute 'backward'
I can't find any syntax error, and I also checked the inputs to the loss function, i.e. nn.BCEWithLogitsLoss(outputs, targets), and found the outputs and targets to be in the correct format. I don't know what is causing this error. It would be great if any of you could help me with this. Thanks in advance!
Here is the code:
def train_loop_fn(data_loader, model, optimizer, device, scheduler=None):
    model.train()
    for bi, d in enumerate(data_loader):
        ids = d["ids"]
        mask = d["mask"]
        token_type_ids = d["token_type_ids"]
        targets = d["targets"]

        ids = ids.to(device, dtype=torch.long)
        mask = mask.to(device, dtype=torch.long)
        token_type_ids = token_type_ids.to(device, dtype=torch.long)
        targets = targets.to(device, dtype=torch.float)

        optimizer.zero_grad()
        outputs = model(ids=ids, mask=mask, token_type_ids=token_type_ids)
        outputs = outputs.reshape(1)

        print("here is the outputs")
        print(outputs)
        print("here is the targets")
        print(targets)

        loss = nn.BCEWithLogitsLoss(outputs, targets)
        print("this is the loss")
        print(loss)

        loss.backward()
        optimizer.step()
        if scheduler is not None:
            scheduler.step()
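For what it's worth, here is a minimal standalone snippet (with dummy tensors standing in for the actual BERT outputs and targets, so the shapes are only assumptions) where backward() runs without this error. Note that the loss module is constructed with no arguments first, and the tensors are passed when it is called:

    import torch
    import torch.nn as nn

    # Dummy stand-ins for the model output (raw logits) and float targets
    outputs = torch.randn(1, requires_grad=True)
    targets = torch.ones(1)

    loss_fn = nn.BCEWithLogitsLoss()  # construct the loss module (no tensors here)
    loss = loss_fn(outputs, targets)  # calling it returns a scalar tensor
    loss.backward()                   # works: loss is a tensor, not a module
    print(loss.item())

In the training loop above, nn.BCEWithLogitsLoss(outputs, targets) is passing the tensors to the constructor itself, so the pattern differs from this working snippet.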