I have a model trained for four epochs, and I saved it after each epoch. Is there a way to calculate the gradient as the difference between two of these checkpoints, i.e. between the model saved after the fourth epoch and the model saved after the third epoch?
I’m unsure what exactly your use case is, but if you want to calculate the difference between the parameters of two models, you could iterate over them directly, as seen here:
import torch
from torchvision import models

modelA = models.resnet18()
modelB = models.resnet18()

diffs = {}
for (name, paramA), paramB in zip(modelA.named_parameters(), modelB.parameters()):
    # store the absolute elementwise difference per parameter tensor
    diffs[name] = (paramA - paramB).abs()
@ptrblck Thanks for your reply. What I want to do is to calculate the gradient as the difference between the two models’ weights, i.e.
gradient = W_n - W_{n-1}, where n is the epoch index.
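A minimal sketch of that idea, using two small randomly initialized modules to stand in for the epoch n-1 and epoch n checkpoints (in practice you would obtain the two state_dicts via torch.load from your saved files; the nn.Linear models and the seed here are just placeholders):

import torch
import torch.nn as nn

torch.manual_seed(0)
# Stand-ins for the checkpoints saved after epoch n-1 and epoch n.
model_prev = nn.Linear(4, 2)
model_curr = nn.Linear(4, 2)

state_prev = model_prev.state_dict()
state_curr = model_curr.state_dict()

# "gradient" = W_n - W_{n-1}, computed separately for each parameter tensor
pseudo_grad = {name: state_curr[name] - state_prev[name] for name in state_curr}

for name, diff in pseudo_grad.items():
    print(name, diff.shape)

Note that this difference is the net change over a whole epoch of updates (for plain SGD, the negative sum of lr-scaled gradients across all steps), not a single gradient from one backward pass.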
My code shows how to calculate the difference between the parameters of two different models. In your case both models would have been trained for some number of epochs, while I just used randomly initialized models as an example.