How do I enforce two features to be symmetric?

Say we have two features x1 and x2.
I want the neural network to give the same results when I switch the two features.
A loose way would be to augment the data by swapping the two columns, but this won't guarantee symmetry.
The network could have multiple layers, and there could be other features on which I don't want to enforce any constraints.

Edit: My guess is that I could create a layer purely for the two features, e.g. a Linear(1, 10), and apply it to both of them, then use another linear layer for the other features. I could then concatenate or add the results in the next layer. Suggestions for better practice are welcome. Thanks!
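A minimal PyTorch sketch of this idea (the layer sizes and the two "other" features are illustrative assumptions, not from the thread):

```python
import torch
import torch.nn as nn

class PairModel(nn.Module):
    def __init__(self):
        super().__init__()
        # One shared layer applied to both x1 and x2, as suggested above.
        self.pair = nn.Linear(1, 10)
        # A separate layer for the remaining features (two of them here).
        self.rest = nn.Linear(2, 10)

    def forward(self, x1, x2, rest):
        h1 = self.pair(x1)    # (batch, 10), same weights...
        h2 = self.pair(x2)    # (batch, 10), ...as for x1
        hr = self.rest(rest)  # (batch, 10)
        return torch.cat([h1, h2, hr], dim=1)  # (batch, 30)

model = PairModel()
x1, x2 = torch.randn(4, 1), torch.randn(4, 1)
rest = torch.randn(4, 2)
out = model(x1, x2, rest)
print(out.shape)  # torch.Size([4, 30])
```

Note that sharing the weights alone does not make the concatenated output symmetric, which is the issue discussed below.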

I agree. I think the best way to go about this is to create a linear layer that acts on just these two features, create a separate linear layer for the remaining features, and then concatenate the outputs.

This doesn't seem to work; the order in which I concatenate them still makes a difference.

What is the final output you want? It depends on that as well. If you apply the same linear layer to the two features, there would be symmetry as long as the ordering of the output does not matter. What output are you expecting from your model?

After I concatenate them, they are passed through a few more layers to yield an output. The two features are no longer symmetric in this case.

x1 --(a few layers)--> f(x1)
x2 --(a few layers)--> f(x2)
(f(x1), f(x2), x3, x4) --(subsequent layers)--> output

So when I swap x1 and x2, the input to the subsequent layers becomes (f(x2), f(x1), x3, x4), which is different and will yield a different final result.
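To make the problem concrete, here is a small sketch (layer sizes are my own illustrative choices) showing that even with a shared f, swapping the inputs changes the concatenated vector and hence the downstream output:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

f = nn.Linear(1, 4)             # shared layer applied to both features
head = nn.Linear(4 + 4 + 2, 1)  # subsequent layer on (f(x1), f(x2), x3, x4)

x1, x2 = torch.randn(1, 1), torch.randn(1, 1)
rest = torch.randn(1, 2)

out_a = head(torch.cat([f(x1), f(x2), rest], dim=1))
out_b = head(torch.cat([f(x2), f(x1), rest], dim=1))  # swapped order
print(torch.allclose(out_a, out_b))  # False: the concat order matters
```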

Yes, if you have more linear layers after the initial ('symmetric') linear layer, you will lose symmetry. You will likely have to keep applying the same layers to the two features separately, and only concatenate the outputs at the very end.
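An alternative worth mentioning: instead of concatenating f(x1) and f(x2) at all, you can combine them with an order-invariant operation such as a sum, which guarantees exact symmetry by construction. A hedged sketch with illustrative sizes (this is the standard permutation-invariant pooling trick, not something from the thread above):

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

class SymmetricModel(nn.Module):
    def __init__(self):
        super().__init__()
        # Shared encoder applied to both x1 and x2.
        self.f = nn.Sequential(nn.Linear(1, 16), nn.ReLU(), nn.Linear(16, 16))
        # Head sees the pooled pair representation plus the other features.
        self.head = nn.Sequential(nn.Linear(16 + 2, 16), nn.ReLU(), nn.Linear(16, 1))

    def forward(self, x1, x2, rest):
        # Summation is invariant to swapping x1 and x2, so the whole
        # model is symmetric regardless of what the head does.
        pooled = self.f(x1) + self.f(x2)
        return self.head(torch.cat([pooled, rest], dim=1))

model = SymmetricModel()
x1, x2 = torch.randn(3, 1), torch.randn(3, 1)
rest = torch.randn(3, 2)
print(torch.allclose(model(x1, x2, rest), model(x2, x1, rest)))  # True
```

Since floating-point addition of two terms is commutative, the swapped and unswapped outputs are bitwise identical, so symmetry holds exactly rather than approximately.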