Gradient of loss w.r.t. a row of model input

Hi There,

Could you please help me with how to calculate the gradient of the loss w.r.t. a row of the model input?


Could you clarify what you mean by "row of model input"? Are you talking about a subtensor of a parameter, or a row of your training data? Or do you mean that your model normally accepts a batch of inputs, but you'd like to compute the gradients w.r.t. the parameters using a single sample instead?
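If you mean the gradient of the loss with respect to the input tensor itself, one common approach in PyTorch is to set `requires_grad=True` on the input and read `input.grad` after the backward pass; each row of `input.grad` is then the gradient of the loss w.r.t. the corresponding input row. A minimal sketch (the model, shapes, and data below are made up purely for illustration):

```python
import torch
import torch.nn as nn

# Hypothetical model and data, just for illustration.
model = nn.Linear(4, 1)
x = torch.randn(3, 4, requires_grad=True)  # batch of 3 input rows
target = torch.randn(3, 1)

loss = nn.functional.mse_loss(model(x), target)
loss.backward()

# x.grad has the same shape as x; row i is d(loss)/d(x[i]).
row_grad = x.grad[0]  # gradient of the loss w.r.t. the first input row
print(x.grad.shape)   # torch.Size([3, 4])
```

If instead you want gradients w.r.t. the *parameters* for a single sample, you could forward just that one row (e.g. `x[0:1]`) and call `backward()` on the resulting loss.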