Federated Learning: SGD on the server

Hello! I am working on a Federated Learning scenario. I first train a global model on the server. I send copies of this model to several clients, and each of them trains the model on its own data. Using SGD, each client locally updates the model weights and sends them back to the server. On the server, I compute a "loss" that is basically the difference between the initial global weights and the local weights sent back by the clients. I would like to use SGD on the server as well, to update the global weights based on this loss. However, I cannot call loss.backward(), since there is no preceding forward() pass that involved these global weights. Any idea on how to proceed in this scenario?
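One common workaround (a minimal sketch, assuming PyTorch; the model, layer sizes, and the client-update placeholder below are made up for illustration): treat the difference between the current global weights and the aggregated client weights as a *pseudo-gradient*, assign it directly to each parameter's `.grad`, and call `optimizer.step()`. No `forward()`/`backward()` is needed on the server, because you are supplying the gradients yourself.

```python
import torch

# Hypothetical tiny global model; any nn.Module works the same way.
global_model = torch.nn.Linear(4, 2)
server_opt = torch.optim.SGD(global_model.parameters(), lr=1.0)

# Placeholder for the aggregated weights returned by the clients
# (here just a dummy shift of the global weights, for illustration).
client_avg = {
    name: p.detach().clone() - 0.1
    for name, p in global_model.named_parameters()
}

# Server step: use (global - client_avg) as a pseudo-gradient,
# so no forward/backward pass is needed on the server.
server_opt.zero_grad()
with torch.no_grad():
    for name, p in global_model.named_parameters():
        p.grad = p.detach() - client_avg[name]
server_opt.step()
```

With `lr=1.0` this reduces to plain federated averaging (the global weights become the client average); with other learning rates, or by swapping `SGD` for another optimizer such as `Adam`, you get a server-side optimizer over the same pseudo-gradient.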

Hi, I have the same problem. Did you manage to fix it or find a solution?