Hi, I want to calculate the projection of the gradient (the conv gradient) onto the direction of the weight (the conv weight).

Does PyTorch implement this (tensor projection)? Or can someone share some information (web page, tutorial) about it?

Thanks

I think some more context and precision would help.

Two things that are close to your description come to mind:

- “conv gradient” could be the gradient of the loss w.r.t. the output of the convolution, and the task is to compute the gradient of the loss w.r.t. the weight. In that case you would take one step of backpropagation, i.e. compute the product of the “conv gradient” with the Jacobian of the convolution. (Not strictly a projection, but somewhat close.)
- “conv gradient” could be the gradient of the loss w.r.t. the weight, and the task is to project it onto the weight (essentially, the gradient you would get if the weight were restricted to the line through the origin and its current value). This is a projection in the conventional sense. To compute it, you would use
`(weight * grad).sum() / weight.norm()**2 * weight`

or so.
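To make the second interpretation concrete, here is a minimal sketch. The layer sizes and the dummy loss are arbitrary choices for illustration; the projection line itself is just the formula above applied to a real conv layer's `.weight` and `.weight.grad`:

```python
import torch
import torch.nn as nn

# Arbitrary example layer and input, just to get a gradient to project.
conv = nn.Conv2d(3, 8, kernel_size=3)
x = torch.randn(2, 3, 16, 16)

# Dummy scalar loss so that backward() fills conv.weight.grad.
loss = conv(x).pow(2).mean()
loss.backward()

w = conv.weight
g = conv.weight.grad

# Projection of the gradient onto the weight direction:
# proj = (<w, g> / ||w||^2) * w
proj = (w * g).sum() / w.norm() ** 2 * w
```

After this, `g - proj` is the component of the gradient orthogonal to the weight, so `((g - proj) * w).sum()` should be (numerically) zero.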

Best regards

Thomas