How to define a function to construct a specific type of tensor with given shape

A tensor is called i-type if each entry equals the sum of its squared indices.
I want a function that constructs an i-type tensor with an arbitrarily given shape. This can be done with NumPy as follows:

import numpy as np

def i_tensor(shape=[2, 2, 2]):
    t = np.zeros(shape)
    # nditer with 'multi_index' visits every entry and exposes its index tuple
    it = np.nditer(t, flags=['multi_index'])
    while not it.finished:
        ind = it.multi_index
        t[ind] = np.sum(np.array(ind) ** 2)  # sum of squared indices
        it.iternext()
    return t

Test 1:

#test the function
> i_tensor([4,4])

> array([[ 0.,  1.,  4.,  9.],
        [ 1.,  2.,  5., 10.],
        [ 4.,  5.,  8., 13.],
        [ 9., 10., 13., 18.]])

Test 2:

> i_tensor([2,2,3])
> array([[[0., 1., 4.],
        [1., 2., 5.]],

       [[1., 2., 5.],
        [2., 3., 6.]]])
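
For reference, the same i-type tensor can also be built in NumPy without an explicit loop: `np.indices` stacks one index grid per dimension, so squaring and summing over axis 0 gives the same result. A sketch (`i_tensor_vec` is just an illustrative name):

```python
import numpy as np

def i_tensor_vec(shape=(2, 2, 2)):
    # np.indices(shape) has shape (len(shape), *shape): one index grid per axis.
    # Squaring and summing over axis 0 yields the sum of squared indices.
    return np.sum(np.indices(shape) ** 2, axis=0)
```

This returns an integer array rather than floats, but the values match the loop version.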

Can I have the same function for a torch tensor? I do not see a torch replacement for nditer.

The following should work (and be much faster, as it only loops over the number of dimensions, not the whole output):


import torch

def i_tensor(shape=[2, 2, 2]):
    ndim = len(shape)
    view_shape = [1] * ndim
    res = 0.
    for d in range(ndim):
        dim_size = shape[d]
        t = torch.arange(dim_size)
        # place the indices along dimension d, then broadcast to the full shape
        view_shape[d] = dim_size
        t = t.view(*view_shape).expand(*shape)
        view_shape[d] = 1
        res += t

    return res

Great answer, and I am trying to digest it. I may be back here with more questions if I do not understand something. Thanks albanD

Sure!
The trick is to create, for each dimension, a Tensor that contains all the indices along that dimension, then add them element-wise.
The view / expand trick is used to reduce memory: only the accumulated result needs to be full size, while each per-dimension Tensor only stores as many elements as that dimension's length.
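
A minimal illustration of that trick (a sketch): `expand` returns a view with stride 0 along the repeated dimension, so the broadcast tensor takes no extra storage.

```python
import torch

t = torch.arange(3)            # shape [3], only 3 elements stored
v = t.view(1, 3).expand(4, 3)  # shape [4, 3], still backed by those 3 elements
# stride 0 along dim 0 means each "row" is the same memory, not a copy
print(v.stride())
print(v)
```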


First, my apologies for a mistake in my original post: each entry of the tensor should be the sum of the squared indices, but the indices were not squared. So your code may need a slight change, and I am trying to figure out how to square the indices before taking the sum. I am very new to torch … :grinning: But I learned a lot of tricks from this short code of yours.

Just got it, changing one line works:
t = torch.arange(dim_size)**2
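
Putting the fix together, the complete torch function then reads (a sketch of the answer above with the one-line change applied):

```python
import torch

def i_tensor(shape=[2, 2, 2]):
    ndim = len(shape)
    view_shape = [1] * ndim
    res = 0.
    for d in range(ndim):
        dim_size = shape[d]
        t = torch.arange(dim_size) ** 2          # squared indices for dim d
        view_shape[d] = dim_size
        t = t.view(*view_shape).expand(*shape)   # broadcast without copying
        view_shape[d] = 1
        res += t
    return res
```

This now reproduces the NumPy output from Test 1, e.g. `i_tensor([4, 4])` matches the 4x4 array shown there.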
