Sync parameter server implementation

As I understand it, the parameter server tutorial based on the RPC framework is a specific implementation built on certain assumptions:
1- The data must be sent to the parameter server (is that feasible for a large dataset?)
2- It is asynchronous.
3- Distributed autograd is used.
Am I correct?

I am wondering: is there any way to implement a synchronous parameter server with the RPC framework without moving data to the parameter server? I mean, just use local gradients and, for example, do model averaging on the parameter server. Does anything like that exist?

Hey @sakh251

1- The data must be sent to the parameter server (is that feasible for a large dataset?)
2- It is asynchronous.
3- Distributed autograd is used.
Am I correct?

Yep, you are correct.

I am wondering: is there any way to implement a synchronous parameter server with the RPC framework without moving data to the parameter server? I mean, just use local gradients and, for example, do model averaging on the parameter server. Does anything like that exist?

Yep, this is possible. Check out this tutorial. It is not exactly what you are looking for, since there the trainers send gradients to the parameter server and the parameter server batch-updates the model, but it shouldn't be too far from your requirements. Adjusting batch_update_size and sending model parameters instead of gradients might do the job.
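To make that concrete, below is a minimal sketch of what such a synchronous, model-averaging parameter server could look like with torch.distributed.rpc. It reuses the @rpc.functions.async_execution pattern from the batch-update tutorial; names like ParameterServer, average_models, and run_trainer are illustrative, and the toy nn.Linear model with random batches stands in for your real model and local data:

```python
import os
import threading

import torch
import torch.distributed.rpc as rpc
import torch.multiprocessing as mp
import torch.nn as nn


class ParameterServer:
    def __init__(self, num_trainers):
        self.num_trainers = num_trainers
        self.lock = threading.Lock()
        self.future_model = torch.futures.Future()
        self.curr_params = None
        self.count = 0

    @staticmethod
    @rpc.functions.async_execution
    def average_models(ps_rref, params):
        # Each trainer sends its locally updated parameters; the PS
        # accumulates them and only fulfills the returned Future once
        # every trainer has reported, which makes the round synchronous.
        self = ps_rref.local_value()
        with self.lock:
            if self.curr_params is None:
                self.curr_params = [p.clone() for p in params]
            else:
                for acc, p in zip(self.curr_params, params):
                    acc += p
            self.count += 1
            fut = self.future_model
            if self.count == self.num_trainers:
                avg = [acc / self.num_trainers for acc in self.curr_params]
                # Reset state for the next round before unblocking everyone.
                self.curr_params, self.count = None, 0
                self.future_model = torch.futures.Future()
                fut.set_result(avg)
        return fut


def run_trainer(ps_rref, rounds=3):
    # Toy model and random batches stand in for the real local dataset.
    model = nn.Linear(10, 1)
    opt = torch.optim.SGD(model.parameters(), lr=0.01)
    loss_fn = nn.MSELoss()
    for _ in range(rounds):
        x, y = torch.randn(32, 10), torch.randn(32, 1)
        opt.zero_grad()
        loss_fn(model(x), y).backward()
        opt.step()  # gradients and data never leave the trainer
        # Send local parameters, block until the global average comes back.
        avg = rpc.rpc_sync(
            ps_rref.owner(),
            ParameterServer.average_models,
            args=(ps_rref, [p.detach() for p in model.parameters()]),
        )
        with torch.no_grad():
            for p, a in zip(model.parameters(), avg):
                p.copy_(a)


def run(rank, world_size):
    os.environ["MASTER_ADDR"] = "localhost"
    os.environ["MASTER_PORT"] = "29500"
    if rank == 0:
        rpc.init_rpc("ps", rank=0, world_size=world_size)
        ps_rref = rpc.RRef(ParameterServer(num_trainers=world_size - 1))
        futs = [
            rpc.rpc_async(f"trainer{r}", run_trainer, args=(ps_rref,))
            for r in range(1, world_size)
        ]
        torch.futures.wait_all(futs)
    else:
        rpc.init_rpc(f"trainer{rank}", rank=rank, world_size=world_size)
    rpc.shutdown()


if __name__ == "__main__":
    world_size = 3  # one parameter server plus two trainers
    mp.spawn(run, args=(world_size,), nprocs=world_size)
```

Note that only model parameters cross the wire here; the training data and the backward pass stay on each trainer, and the Future returned from the async_execution function is what gives you the synchronization barrier without blocking an RPC thread per trainer.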


Thanks a lot, it is exactly what I was looking for.