optim.Adam error

I’m using:

dataclasses                   0.6
dataclasses-json              0.6.7

and cannot reproduce the issue:

import torch
import torch.nn as nn

# create a single trainable parameter and pass it to Adam
params = [nn.Parameter(torch.randn(1))]
optimizer = torch.optim.Adam(params=params, lr=0.001)

Does this minimal code snippet already fail in your environment?
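If creating the optimizer works, you could also run a single update step to see whether the failure appears later; here is a minimal sketch using the same dummy parameter and an arbitrary dummy loss:

import torch
import torch.nn as nn

params = [nn.Parameter(torch.randn(1))]
optimizer = torch.optim.Adam(params=params, lr=0.001)

# one optimization step with a dummy loss
optimizer.zero_grad()
loss = (params[0] ** 2).sum()
loss.backward()
optimizer.step()
print(params[0])  # the value should change slightly after the step

If both snippets run for you, the error is most likely raised elsewhere in your script, so please post the full stack trace.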