optim.Adam error

I get the following error:

File :1
def create_fn(dataclass_type_nx-loopback, dataclass_HAS_DEFAULT_FACTORY, dataclass_builtins_object, dataclass_return_type):
^
SyntaxError: invalid syntax
whenever I use any optimizer, e.g.:

optimizer = optim.Adam(params=params, lr=0.001)

Which dataclasses version are you using? Does updating it (in case you are using an older one) help?
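One quick check (just a sketch; it only relies on the standard sys.version and dataclasses.__file__ attributes) would be to see which dataclasses module your interpreter actually imports:

import sys
import dataclasses

# Prints the interpreter version and the file the dataclasses module was
# loaded from, which shows whether the standard-library module or the
# pip-installed backport (meant only for Python 3.6) is being used.
print(sys.version)
print(dataclasses.__file__)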

Name: dataclasses
Version: 0.6
Summary: A backport of the dataclasses module for Python 3.6
Home-page: https://github.com/ericvsmith/dataclasses
Author: Eric V. Smith
Author-email: eric@python.org
License: Apache
Location: D:\anaconda3\envs\pytorch\Lib\site-packages
Requires:

I’m using:

dataclasses                   0.6
dataclasses-json              0.6.7

and cannot reproduce any issues:

import torch
import torch.nn as nn 

# a single dummy parameter is enough to construct the optimizer
params = [nn.Parameter(torch.randn(1))]
optimizer = torch.optim.Adam(params=params, lr=0.001)

Does this code already fail in your environment?
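If the snippet above runs, a slightly extended sanity check (the dummy loss below is only an assumption for illustration, not taken from your code) would also perform a single update step:

import torch
import torch.nn as nn

params = [nn.Parameter(torch.randn(1))]
optimizer = torch.optim.Adam(params=params, lr=0.001)

# Dummy scalar loss, only to exercise backward() and step().
loss = (params[0] ** 2).sum()
loss.backward()
optimizer.step()
optimizer.zero_grad()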