Hi all,
For my use case, I need to manually calculate the gradient w.r.t. the parameters of my network and do backpropagation.
This is how I'm currently calculating the loss and doing backpropagation on the parameters:
network_optimizer.zero_grad()
network_loss = criterion(data)
gradients = network.get_gradient() #Manual gradient calculation
for param in network.parameters():
    param.grad = gradients
    param.backward(gradients)
network_optimizer.step()
Assuming my gradient calculation is correct and the gradients tensor and the params tensor have the same shape, does the above correctly assign the gradients to the parameters and do backpropagation w.r.t. the parameters?
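In case it helps, here is a minimal sketch of the pattern I think I'm aiming for (the nn.Linear model and the random gradients are just placeholders for my actual network and my manual gradient calculation) — each parameter gets its own gradient assigned to param.grad, and optimizer.step() then consumes those gradients directly, with no backward() call:

import torch
import torch.nn as nn

network = nn.Linear(4, 2)  # placeholder for my actual network
network_optimizer = torch.optim.SGD(network.parameters(), lr=0.01)

# Placeholder for my manual gradient calculation: one tensor per parameter,
# each matching that parameter's shape.
gradients = [torch.randn_like(p) for p in network.parameters()]

network_optimizer.zero_grad()
for param, grad in zip(network.parameters(), gradients):
    param.grad = grad         # assign this parameter's manually computed gradient
network_optimizer.step()      # the optimizer reads param.grad directly; no backward() needed

Is this what I should be doing instead of calling param.backward(gradients)?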
Any insight would be greatly appreciated!