How to initialize a quantized.Embedding?

The API doc says quantized.Embedding implements the same interface as nn.Embedding, so it should be a drop-in replacement.

I tried to use it, but my model previously used `init.uniform_` to initialize the Embedding, and this initialization does not appear to be compatible with the quantized version. What is the recommended way to initialize quantized.Embedding?

Error message:

```
Traceback (most recent call last):
    self.reset_parameters()
  File "/data00/home/user/byted/test/poc/test_dw.py", line 148, in reset_parameters
    init.uniform_(self.node_embed.weight.data, -init_range, init_range)
AttributeError: 'function' object has no attribute 'data'
```
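A minimal reproduction of the incompatibility (a sketch; the sizes are hypothetical, and the exact import path for the quantized module may vary across PyTorch versions):

```python
import torch.nn.quantized as nnq
from torch.nn import init

embed = nnq.Embedding(num_embeddings=10, embedding_dim=4)

# This works on nn.Embedding, where `weight` is an nn.Parameter,
# but fails on the quantized module:
init.uniform_(embed.weight.data, -0.1, 0.1)  # AttributeError
```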

Thank you!

I figured it out by reading the code on GitHub; there is a slight difference between the two implementations, and I have now made the initialization work.
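For reference, here is a sketch of one route that is consistent with the upstream code: initialize a plain float `nn.Embedding` first, then convert it to the quantized module with `from_float` (hypothetical sizes, and import paths may differ between PyTorch versions):

```python
import torch
import torch.nn as nn
import torch.nn.quantized as nnq
from torch.nn import init
from torch.quantization import float_qparams_weight_only_qconfig

num_embeddings, embedding_dim, init_range = 1000, 64, 0.1  # hypothetical sizes

# Initialize a regular float embedding the usual way.
float_embed = nn.Embedding(num_embeddings, embedding_dim)
init.uniform_(float_embed.weight.data, -init_range, init_range)

# Attach the weight-only qconfig that from_float expects, then convert;
# the quantized module is built from the already-initialized float weights.
float_embed.qconfig = float_qparams_weight_only_qconfig
quant_embed = nnq.Embedding.from_float(float_embed)
```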

Previously, my model worked perfectly fine with nn.Embedding. Now that it runs with quantized.Embedding, I get this optimizer error. Any suggestions?

```
Traceback (most recent call last):
  File "/data00/home/user/byted/poc/test_dw.py", line 246, in <module>
    optimizer = torch.optim.SGD(model.parameters(), lr=0.01, momentum=0.9)
  File "/home/user/anaconda3/envs/egs39/lib/python3.9/site-packages/torch/optim/sgd.py", line 27, in __init__
    super().__init__(params, defaults)
  File "/home/user/anaconda3/envs/egs39/lib/python3.9/site-packages/torch/optim/optimizer.py", line 273, in __init__
    raise ValueError("optimizer got an empty parameter list")
ValueError: optimizer got an empty parameter list
```
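A likely cause, for anyone else who lands here: the quantized module stores its weight as a packed, quantized tensor rather than as an `nn.Parameter`, so it contributes nothing to `model.parameters()`. If the embedding was the model's only parameterized module, the list the optimizer receives is empty. A minimal sketch of the effect (hypothetical sizes, import path may vary by version):

```python
import torch.nn as nn
import torch.nn.quantized as nnq

print(len(list(nn.Embedding(10, 4).parameters())))   # 1 - the float weight
print(len(list(nnq.Embedding(10, 4).parameters())))  # 0 - packed weight is not a Parameter
```

Since the quantized modules are intended for inference, the usual pattern is to train the float model and quantize it afterwards (or use quantization-aware training with fake-quant modules), rather than handing quantized modules to an optimizer.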