Weight update using the gradient norm as a loss function

Assume an image classification task using a neural net where the loss function is defined as L. Consider an image x; while performing gradient descent on the loss, the gradient of L with respect to x is dL/dx. Now for my doubt: after this, I want to calculate the gradient of the norm of dL/dx with respect to the weights of the neural net. How do I do that?

Hi,

You can do the following:

import torch
from torch import autograd

x = torch.rand(1, 3, 32, 32, requires_grad=True)  # input must require grad to get dL/dx
output = net(x)          # net is your classifier
L = crit(output, label)  # crit is your loss criterion

# create_graph=True builds a graph for the gradient itself,
# so dLdx stays differentiable for a second backward pass
dLdx = autograd.grad(L, x, create_graph=True)[0]

# Backprop the norm of dL/dx into the network's weights
dLdx.norm().backward()
# Now you have the grads as usual in the .grad fields
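
For example, to confirm the weight gradients were populated, you can loop over the parameters (just an illustrative check, assuming net is a standard nn.Module):

for name, p in net.named_parameters():
    print(name, p.grad.shape)  # filled in by dLdx.norm().backward()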

Thanks, this works perfectly!