How to constrain one half of a trainable Tensor to be equal to the flipped first half?

Hello everybody,

I am looking for the following:

Assume that I have a trainable tensor T of shape torch.size([8]).
Now I would like to train that tensor but also to have the following constraint:
the tensor should be symmetric in the sense that
T[0] = T[7]
T[1] = T[6],
T[2] = T[5],
T[3] = T[4].

i.e. the tensor T actually consists of two halves, one of which is the flipped version of the other.

I have tried to assign to the first half of T its flipped version, i.e.

self.T[4:] = torch.flip(self.T[:4],dims=[0])

but this fails, as the optimizer tells me that it can’t optimize a non-leaf tensor.

Any ideas on how to make that work?

Thank you very much in advance!

Hi Kof!

I would not use a tensor of length 8 and attempt to constrain its first
half to mirror its second. Just use a tensor of length 4, and use each
of its 4 elements twice in your computations.

Here is an example:

>>> import torch
>>> torch.__version__
>>> T = torch.randn (4, requires_grad = True)
>>> v = torch.arange (8).float()
>>> (torch.cat ((T, torch.flip (T, (0,)))) @ v).backward()
>>> T.grad
tensor([7., 7., 7., 7.])

(The business with v is just to give the example something to run
.backward() on and show that autograd works.)


K. Frank


Hi Frank!

Works like a charm!
Thanks a lot!