Kronecker Product

Is there a way to perform a Kronecker product between two matrices? I know there are outer products for vectors, but is there something like that for 2D tensors?


I’m afraid not. But you can probably reproduce the result by repeating one of the tensors and reshaping.

Okay. I was going to use the “upsample” method to replicate this, but this will only work with square matrices. Is there a way to upsample with a different scale factor along the different dimensions?

How about this? I don’t think it’s an efficient way, though.

h, w = t1.size()
c = t2.size(0)
# Note: this assumes t2 is a 1-D vector of length c (see the follow-up below)
t2.repeat(h, w, 1) * t1.unsqueeze(2).repeat(1, 1, c)

That doesn’t seem to work (I just tried t1 = t2 = torch.randn(2, 2))

This should work for arbitrarily sized matrices, though I’m not sure how fast/slow it will be:

def kronecker_product(t1, t2):
    """
    Computes the Kronecker product between two tensors.
    See https://en.wikipedia.org/wiki/Kronecker_product
    """
    t1_height, t1_width = t1.size()
    t2_height, t2_width = t2.size()
    out_height = t1_height * t2_height
    out_width = t1_width * t2_width

    tiled_t2 = t2.repeat(t1_height, t1_width)
    expanded_t1 = (
        t1.unsqueeze(2)
          .unsqueeze(3)
          .repeat(1, t2_height, t2_width, 1)
          .view(out_height, out_width)
    )

    return expanded_t1 * tiled_t2
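
A quick shape check; on PyTorch 1.8+ you can also compare against the built-in torch.kron:

t1 = torch.randn(2, 3)
t2 = torch.randn(4, 5)
out = kronecker_product(t1, t2)
print(out.shape)  # torch.Size([8, 15])
# assert torch.allclose(out, torch.kron(t1, t2))  # PyTorch >= 1.8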

Ah, no. I had indeed assumed t2 was a vector.

The following seems to work:

def kronecker(matrix1, matrix2):
    # torch.ger = outer product (deprecated in favor of torch.outer in recent PyTorch)
    return (torch.ger(matrix1.view(-1), matrix2.view(-1))
            .reshape(*(matrix1.size() + matrix2.size()))
            .permute([0, 2, 1, 3])
            .reshape(matrix1.size(0) * matrix2.size(0), matrix1.size(1) * matrix2.size(1)))
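
A small worked example to see the block structure (the Kronecker product of a 2x2 identity with a 2x2 all-ones matrix is block diagonal):

kronecker(torch.eye(2), torch.ones(2, 2))
# tensor([[1., 1., 0., 0.],
#         [1., 1., 0., 0.],
#         [0., 0., 1., 1.],
#         [0., 0., 1., 1.]])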


This works for 2D tensors of size [B, D], where the first dimension B is the batch size; size_t1 and size_t2 are the feature dimensions D of t1 and t2:

def kronecker_product(t1, t2, size_t1, size_t2):
    # Batched outer product: (B, size_t1, 1) @ (B, 1, size_t2) -> (B, size_t1, size_t2)
    fusion_tensor = torch.bmm(t1.unsqueeze(2), t2.unsqueeze(1))
    # Flatten each per-sample outer product into a single row
    fusion_tensor = fusion_tensor.view(-1, size_t1 * size_t2)
    return fusion_tensor
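
For example, with a batch of 8 and feature sizes 4 and 5:

t1 = torch.randn(8, 4)
t2 = torch.randn(8, 5)
out = kronecker_product(t1, t2, 4, 5)
print(out.shape)  # torch.Size([8, 20])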

The following GitHub repo has an efficient implementation of the Kronecker product of multiple tensors:
gpytorch/kronecker_product_lazy_tensor.py
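
I haven’t tried it myself, and the exact names may differ across gpytorch versions, but usage should look roughly like this (lazify, KroneckerProductLazyTensor, and evaluate are from the gpytorch 1.x lazy API):

import torch
import gpytorch

A = torch.randn(3, 3)
B = torch.randn(4, 4)

# Build the Kronecker product lazily; nothing is materialized yet
kron = gpytorch.lazy.KroneckerProductLazyTensor(
    gpytorch.lazy.lazify(A), gpytorch.lazy.lazify(B)
)
dense = kron.evaluate()  # (12, 12) dense tensor, only if you actually need it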


Another simple way:

def kronecker(A, B):
    return torch.einsum("ab,cd->acbd", A, B).view(A.size(0) * B.size(0), A.size(1) * B.size(1))
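
As a quick sanity check (assuming the kronecker_product function from earlier in the thread is still in scope), the implementations agree:

A = torch.randn(3, 2)
B = torch.randn(2, 4)
assert torch.allclose(kronecker(A, B), kronecker_product(A, B))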

I made a broadcasting-based implementation that works with batch dimensions: https://gist.github.com/yulkang/4a597bcc5e9ccf8c7291f8ecb776382d
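
The core idea is just broadcasting over the trailing matrix dimensions; a minimal sketch of the same idea (my paraphrase, not the gist’s actual code):

def batched_kron(a, b):
    # a: (..., m, n), b: (..., p, q) -> (..., m*p, n*q)
    m, n = a.shape[-2:]
    p, q = b.shape[-2:]
    # (..., m, 1, n, 1) * (..., 1, p, 1, q) -> (..., m, p, n, q)
    res = a.unsqueeze(-1).unsqueeze(-3) * b.unsqueeze(-2).unsqueeze(-4)
    return res.reshape(*res.shape[:-4], m * p, n * q)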


I’d guess this should be the top choice, since it’s lazy: it only computes elements when you access them, so it uses less memory.

This approach is really interesting.

Hi @SaiSurya, can you give an example of how to use KroneckerProductLazyTensor? Many thanks.