Hi,
I have a model and I need to train it with a self-calculated gradient instead of a loss function, but how do I backpropagate this gradient? I have a model and an optimizer:
import torch
import torch.optim as optim

model = Model()  # my network (an nn.Module)
optimizer = optim.Adam(model.parameters(), lr=1e-4)
Now I can compute the output and the gradient:
optimizer.zero_grad()                        # clear old gradients
out = model(input)                           # forward pass
gradient = my_custom_gradient_function(out)  # my own gradient, computed from the output
Normally I would now call loss.backward() and optimizer.step() when I have a loss function, but how can I backpropagate with this precomputed gradient instead?
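To make it concrete, here is a minimal self-contained sketch of my setup (the Model class and my_custom_gradient_function below are just stand-ins for my real code, only there so the snippet runs):

import torch
import torch.nn as nn
import torch.optim as optim

# Stand-in for my real network.
class Model(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(10, 10)

    def forward(self, x):
        return self.fc(x)

# Stand-in for my real gradient computation; the real one is
# calculated outside of autograd, without any loss function.
def my_custom_gradient_function(out):
    return torch.ones_like(out)

model = Model()
optimizer = optim.Adam(model.parameters(), lr=1e-4)

input = torch.randn(1, 10)
optimizer.zero_grad()
out = model(input)                           # forward pass
gradient = my_custom_gradient_function(out)  # same shape as out
# ... this is where I'm stuck: how do I push this gradient back
# through the model so that optimizer.step() updates the parameters?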