In a neural language model, what does "no symmetry in how the inputs are processed" mean?

I am studying Stanford CS224n, the neural language model part. The first naive model assumes window_size = 4, concatenates the word embeddings [e1, e2, e3, e4], and multiplies the result by a matrix W ----> hidden layer ----> output.
It says one of the model's disadvantages is: "e1 and e2 are multiplied by completely different weights in W. No symmetry in how the inputs are processed." I am stuck on the following questions.
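To make the question concrete, here is a minimal sketch of the model as I understand it (the dimensions are made up by me, and the block names `W1..W4` are my own labels, not from the lecture). It shows that multiplying W by the concatenated vector is the same as multiplying each embedding by its own column block of W:

```python
import numpy as np

rng = np.random.default_rng(0)
d, h, window = 3, 5, 4                   # toy dims: embedding, hidden, window size
W = rng.standard_normal((h, window * d)) # one big weight matrix, shape (5, 12)
e = [rng.standard_normal(d) for _ in range(window)]  # embeddings e1..e4

x = np.concatenate(e)                    # concatenated input, shape (12,)
hidden = W @ x                           # hidden layer pre-activation, shape (5,)

# W splits column-wise into per-position blocks: W = [W1 | W2 | W3 | W4],
# and W @ x == W1 @ e1 + W2 @ e2 + W3 @ e3 + W4 @ e4
blocks = [W[:, i * d:(i + 1) * d] for i in range(window)]
same = sum(blocks[i] @ e[i] for i in range(window))
print(np.allclose(hidden, same))  # prints True
```

So each position in the window has its own independent block of parameters, and (if I understand correctly) that is why e1 and e2 are said to be "multiplied by completely different weights".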

  1. I am really confused about what "multiplied by different weights" means. Can someone demonstrate it with some dimensionality analysis?

  2. The lecturer also says "you are kind of learning some similar functions many times". In my understanding, concatenation is very common in the last layers of neural networks (e.g., in some fusion tasks). Does that mean what I have been doing is wrong? If not, what is the difference between what the lecturer describes and our common use of concatenation in networks?

I really hope someone can help me.