# How to constrain one half of a trainable Tensor to be equal to the flipped first half?

Hello everybody,

I am looking for the following:

Assume that I have a trainable tensor T of shape torch.Size([8]).
Now I would like to train that tensor but also to have the following constraint:
the tensor should be symmetric in the sense that
T[4] = T[3],
T[5] = T[2],
T[6] = T[1],
T[7] = T[0],

i.e. the tensor T consists of actually two tensors, one of which is the flipped version of the other one.

I have tried to assign to the first half of T its flipped version, i.e.

self.T[4:] = torch.flip(self.T[:4], dims=(0,))

but this fails, as the optimizer tells me that it can’t optimize a non-leaf tensor.

Any ideas on how to make that work?

Thank you very much in advance!

Hi Kof!

I would not use a tensor of length 8 and attempt to constrain its first
half to mirror its second. Just use a tensor of length 4, and use each
of its 4 elements twice in your computations.

Here is an example:

```
>>> import torch
>>> torch.__version__
'1.7.1'
>>> T = torch.randn (4, requires_grad = True)
>>> v = torch.arange (8).float()
>>> torch.dot (torch.cat ((T, torch.flip (T, (0,)))), v).backward()
>>> T.grad
tensor([7., 7., 7., 7.])
```

(The business with `v` and `torch.dot()` is just to give the example
something to run `.backward()` on to show that autograd works.)

Best.

K. Frank

Hi Frank!

Worked like a charm!
Thanks a lot!