How to operate on a matrix consisting of vectors in pytorch?

I have tensors like this:

arr1 = np.array([[ 1.6194, -0.6058, -0.8012], [ 1.1483,  1.6538, -0.8062]])
arr2 = np.array([[-0.3180, -1.8249,  0.0499], [-0.4184,  0.6495, -0.4911]])
X = torch.Tensor(arr1)
Y = torch.Tensor(arr2)

I want to compute the dot product of every pair of 1D tensors (2 vectors) inside my 2D tensors X and Y.

I want to get the result like this tensor([dotResult1, dotResult2]).
But I got this error:

RuntimeError: 1D tensors expected, but got 2D and 2D tensors

My main purpose is to perform “something” operation on every vector inside my matrix, but I don’t want to use a loop here. Does anyone know how to do that?

Hi Ellipsis …

This will depend on what “something” is. In general, you would want to
figure out how to do “something” using pytorch tensor operations that act
on your 2D tensors all at once.

For your particular example of computing the dot products of the vectors
in your 2D tensors, you can use element-wise multiplication and then sum
along the rows or, equivalently, use the general-purpose einsum() to do the same:

>>> import torch
>>> torch.__version__
>>> X = torch.tensor([[ 1.6194, -0.6058, -0.8012], [ 1.1483,  1.6538, -0.8062]])
>>> Y = torch.tensor([[-0.3180, -1.8249,  0.0499], [-0.4184,  0.6495, -0.4911]])
>>> resultA = (X * Y).sum(dim = 1)
>>> resultA
tensor([0.5506, 0.9896])
>>> resultB = torch.einsum('ij, ij -> i', X, Y)
>>> torch.equal(resultA, resultB)
True

(As you’ve noticed, pytorch does not offer a specific “batch-vector-dot-product”
function – not that it needs one.)
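If you prefer matrix-multiplication primitives, the same batch of dot products can also be expressed with bmm() by viewing each row of X as a (1, 3) matrix and each row of Y as a (3, 1) matrix. A minimal sketch, using the same values as above:

```python
import torch

X = torch.tensor([[ 1.6194, -0.6058, -0.8012], [ 1.1483,  1.6538, -0.8062]])
Y = torch.tensor([[-0.3180, -1.8249,  0.0499], [-0.4184,  0.6495, -0.4911]])

# unsqueeze(1) makes X a batch of (1, 3) row matrices and unsqueeze(2)
# makes Y a batch of (3, 1) column matrices; bmm() then yields a batch of
# (1, 1) "matrices" holding the dot products, which flatten() turns into
# a plain 1D tensor.
resultC = torch.bmm(X.unsqueeze(1), Y.unsqueeze(2)).flatten()
print(resultC)
```

This avoids materializing the intermediate element-wise product tensor, though for small tensors like these the difference is negligible.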


K. Frank


Thank you, it works!