Hello all, I am trying to use learning-rate scheduling (`ReduceLROnPlateau`), and I got this error:

```
value cannot be converted to type float without overflow: inf
```
```python
optimizer = torch.optim.SGD(model.parameters(), lr=0.1, momentum=0.0)
scheduler = ReduceLROnPlateau(optimizer, 'min')

for epoch in range(10):
    for i, batch in enumerate(dataloader):
        output = model(batch)
        loss = criteria(output)
        scheduler.step(loss)
```
I wonder why this happens and how to solve it.
Also, even without the LR scheduler, the same runtime error occurs when I set momentum = 0.9.
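For comparison, the pattern I understood from the docs is roughly the following (a minimal self-contained sketch; the model, data, and hyperparameters here are placeholders, not my real setup). It calls `scheduler.step()` once per epoch with a finite scalar, after `optimizer.step()`:

```python
import torch
import torch.nn as nn
from torch.optim.lr_scheduler import ReduceLROnPlateau

# Placeholder model and data, just to make the loop runnable
model = nn.Linear(4, 1)
criterion = nn.MSELoss()
x = torch.randn(32, 4)
y = torch.randn(32, 1)

optimizer = torch.optim.SGD(model.parameters(), lr=0.1, momentum=0.0)
scheduler = ReduceLROnPlateau(optimizer, 'min', patience=2)

for epoch in range(10):
    optimizer.zero_grad()
    loss = criterion(model(x), y)
    loss.backward()
    optimizer.step()
    # Pass a plain finite scalar, once per epoch (not per batch)
    scheduler.step(loss.item())

print(optimizer.param_groups[0]['lr'])
```

Is the difference that I was calling `scheduler.step(loss)` inside the batch loop, or is it something about the loss itself becoming `inf`?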