Hi, how can I use the gradient from the first epoch in the next epoch? In my current code, `feat_s[0].grad` is always `None`:
optimizer.zero_grad()
feat_s, train_pred = model(train_data)
feat_s[0].retain_grad()   # must be called before backward(), on the non-leaf tensor
loss = criterion(train_pred, train_target)   # however your loss is computed
loss.backward()
grad = feat_s[0].grad     # .grad is only populated after backward()
if grad is not None:
    print(grad)
optimizer.step()
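To carry the gradient from one epoch into the next, save a detached copy after `backward()` and read it on the following iteration. Below is a minimal, self-contained sketch; the `ToyModel`, `criterion`, data, and the regularization term using the saved gradient are all hypothetical stand-ins for your actual setup:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

class ToyModel(nn.Module):
    """Hypothetical model whose forward returns an intermediate
    feature alongside the prediction, like model(train_data) above."""
    def __init__(self):
        super().__init__()
        self.encoder = nn.Linear(4, 8)
        self.head = nn.Linear(8, 1)

    def forward(self, x):
        feat = self.encoder(x)
        return feat, self.head(feat)

model = ToyModel()
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
criterion = nn.MSELoss()
x = torch.randn(16, 4)   # dummy train_data
y = torch.randn(16, 1)   # dummy targets

prev_grad = None  # gradient saved from the previous epoch
for epoch in range(2):
    optimizer.zero_grad()
    feat, pred = model(x)
    feat.retain_grad()          # keep the grad of this non-leaf tensor
    loss = criterion(pred, y)
    if prev_grad is not None:
        # Example use of the previous epoch's gradient (hypothetical):
        # penalize features along last epoch's feature-gradient direction.
        loss = loss + 0.01 * (feat * prev_grad).sum()
    loss.backward()
    prev_grad = feat.grad.detach().clone()  # save for the next epoch
    optimizer.step()
```

The `detach().clone()` matters: it snapshots the gradient so it is not freed or overwritten by the next backward pass, and keeps it out of the next epoch's autograd graph.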