RuntimeError: lhs of assignment must be a variable, subscript, or starred expression:

After I ran QAT on my model, I tried to save it, but it did not work:


Could you share a small code snippet that reproduces the crash, please?

if args.qat:
    configure(args.log_dir, flush_secs=5)
    model.qconfig = torch.quantization.get_default_qat_qconfig('fbgemm')
    torch.quantization.prepare_qat(model, inplace=True)
    # print(model.parameters())

optimizer = torch.optim.SGD(model.parameters(), lr=args.lr)  # remaining hyperparameters elided in the original
best_loss = 100
for nepoch in range(args.epochs):
    train_one_epoch(nepoch, model, criterion, optimizer, data_loader, device)
    if nepoch > 90:
        # Freeze quantizer parameters
        model.apply(torch.quantization.disable_observer)
    if nepoch > 95:
        # Freeze batch norm mean and variance estimates
        model.apply(torch.nn.intrinsic.qat.freeze_bn_stats)

    # Check the accuracy after each epoch
    quantized_model = torch.quantization.convert(model.to('cpu').eval(), inplace=False)
    size = get_size_of_model(quantized_model)
    top1, top5, loss = evaluate(quantized_model, data_loader_test)
    if loss < best_loss:
        best_loss = loss
        torch.save(quantized_model.state_dict(), 'quantizated_best.pth')
        print(f'Epoch {nepoch} | top1: {top1} | top5: {top5} | loss {loss} | size {size}')

    log_value('Validating/Accuracy', top1, nepoch)
    log_value('Validating/Loss', loss, nepoch)

torch.save(quantized_model.state_dict(), 'quantizated_final.pth')

Everything is OK except the saving step.

I already solved this problem, thank you!

There was an empty nn.Sequential() in my module, from which torch.quantization.convert can't remove the observer,

so I removed it manually, and now it works.
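A minimal sketch of that workaround, for anyone hitting the same error: walk the model before calling torch.quantization.convert and replace any empty nn.Sequential containers with nn.Identity. The `Block` model and `strip_empty_sequential` helper below are illustrative, not the original poster's code.

```python
import torch.nn as nn

class Block(nn.Module):
    """Toy module with the problematic pattern: an empty nn.Sequential."""
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv2d(3, 8, 3, padding=1)
        self.shortcut = nn.Sequential()  # empty container that convert trips over

    def forward(self, x):
        return self.conv(x) + self.shortcut(x)

def strip_empty_sequential(module):
    """Recursively replace empty nn.Sequential children with nn.Identity."""
    for name, child in module.named_children():
        if isinstance(child, nn.Sequential) and len(child) == 0:
            setattr(module, name, nn.Identity())
        else:
            strip_empty_sequential(child)

model = Block()
strip_empty_sequential(model)
print(type(model.shortcut).__name__)  # Identity
```

nn.Identity is a drop-in replacement because an empty nn.Sequential already behaves as the identity function in forward(), so model outputs are unchanged.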


BTW, we have a fix for the empty Sequential case as well:
