Gradient between two different tensors

Hi,

can I calculate the gradient between two tensors, or is there a gradient between two tensors?

Hi,

Could you clarify your question?
You can get the partial derivatives of multi-variate functions.
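For example, a minimal sketch in PyTorch (using a made-up function f(x0, x1) = x0² + 3·x1, purely for illustration):

import torch

x = torch.tensor([1.0, 2.0], requires_grad=True)
f = x[0] ** 2 + 3 * x[1]   # a toy multi-variate function
f.backward()               # populate x.grad with the partial derivatives
print(x.grad)              # tensor([2., 3.]) = (df/dx0, df/dx1)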

I mean I have two tensors. Each tensor represents the gradient between two images (the output image w.r.t. the input image), and I want to calculate the gradient between these tensors.

these tensors

is quite vague here. Do you mean the input/output images? Some gradients?

Can you write a mathematical formula for what you want?

I have two tensors as output from a CNN, and I want to calculate the gradient between them.
Is there a gradient between two different tensors?

Is there a gradient between two different tensors?

Not really. Usually you have the gradient of a function :slight_smile:

import numpy as np
np.gradient(np.array([[1, 2, 6], [3, 4, 5]], dtype=np.float64))

That is the gradient between two arrays.

This is a single numpy array.

Also, this array is interpreted as containing multiple evaluations of a function at different values, and np.gradient computes the gradient of the function those values represent.
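For concreteness, a small sketch of what np.gradient returns for that array: one finite-difference array per axis (the comments show the actual outputs):

import numpy as np

a = np.array([[1.0, 2.0, 6.0], [3.0, 4.0, 5.0]])
d_rows, d_cols = np.gradient(a)   # one finite-difference array per axis
print(d_rows)  # [[2. 2. -1.], [2. 2. -1.]]  (differences between the rows)
print(d_cols)  # [[1. 2.5 4.], [1. 1. 1.]]   (differences along each row)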
Is that what you want?

I want to convert the two tensors to two arrays and calculate the gradient between them.

I am sorry, what do you mean by gradient here?

  • Do you want a finite-difference approximation of the gradient? If so, how should the tensors be read: as two different evaluations of the same function, or does each tensor correspond to a single function?
  • Or do you want gradients with autodiff, as we usually do, where one tensor was created in a differentiable manner from the other, and you want the gradient of the function that maps one to the other?

Each tensor corresponds to a single function. I want to calculate gradients, and one tensor was created in a differentiable manner from the other.

Oh,

So it has nothing to do with np.gradient().

In that case, if you create y = f(x), you can get the gradient of y w.r.t. x by doing:

import torch

x.requires_grad_()                   # make autograd track operations on x
y = f(x)                             # forward pass through your function f
grad = torch.autograd.grad(y, x)[0]  # gradient of y w.r.t. x

Note that if y contains more than one element, you will need to provide grad_outputs=v, and it will compute the product of v with the Jacobian of f (a vector-Jacobian product, vᵀJ).
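As an illustration (with a toy function y = x**2, so the Jacobian is diag(2·x); v here is just a vector of ones):

import torch

x = torch.tensor([1.0, 2.0], requires_grad=True)
y = x ** 2                                   # y has two elements, not a scalar
v = torch.ones_like(y)                       # grad_outputs vector
grad = torch.autograd.grad(y, x, grad_outputs=v)[0]
print(grad)                                  # tensor([2., 4.]) = v^T J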


That is great… Thank you so much… But can I set a weight for these two tensors?

What do you mean by weight? Like how much each entry in y should contribute to the gradient? If so, then that is what v is doing, in some sense.
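For instance, a small sketch of using v as per-entry weights (same toy y = x**2 as above; the weight values are arbitrary):

import torch

x = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)
y = x ** 2
v = torch.tensor([1.0, 0.5, 0.0])            # how much each entry of y counts
grad = torch.autograd.grad(y, x, grad_outputs=v)[0]
print(grad)                                  # tensor([2., 2., 0.]) = v^T J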

I want to initialize the same weight for every tensor, compute a function between the two tensors after setting the weights, and then calculate the gradient… Can I do that?

Sorry, I don’t understand what you want here… If you could write down what you want as pseudo-code or math, that would help.