Accessing gradients with respect to weights

Hi. For a related project, I need to access the gradients with respect to the weights. The weights themselves, which are updated as w = w - learning_rate * grad, can be accessed using layer.weights. However, at every layer I need to access the grad that is used to update the weights, not the weights themselves. Is there any way to do this?

Hi,

After calling .backward(), each of your weight Tensors will have a .grad field that contains a Tensor holding the gradient of the loss with respect to that parameter. So layer.weights.grad in your case.
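
For illustration, here is a minimal sketch of that flow, assuming a standard nn.Linear layer (note that on built-in PyTorch modules the parameter is named weight, so the attribute is layer.weight.grad; the layer, inputs, and loss below are just placeholders):

```python
import torch
import torch.nn as nn

layer = nn.Linear(4, 2)        # toy layer with learnable parameters
x = torch.randn(8, 4)          # dummy batch
target = torch.randn(8, 2)     # dummy targets

loss = nn.functional.mse_loss(layer(x), target)
loss.backward()                # populates .grad on every parameter that requires grad

print(layer.weight.grad)       # gradient of the loss w.r.t. the weights (same shape as layer.weight)
print(layer.bias.grad)         # gradient of the loss w.r.t. the bias
```

Keep in mind that .grad is only populated after a backward pass, and by default gradients accumulate across calls, so you typically read it right after .backward() (and before zeroing the gradients for the next step).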