How could I update attributes of a RRef?

I am studying the tutorial on parameter server parameter server. I would like to update the grad of parameters during training (i.e. between `dist_autograd.backward(cid, [loss])` and `opt.step(cid)`). How could I achieve this?

Hey @Kunlin_Yang, say, `param_rref` is the RRef of the corresponding parameter, you can do sth like the following:

``````import torch.distributed.autograd as dist_autograd
...

# do inplace update on the grad

# following is a toy training step