Computation of weight matrix for the residual block of a Residual Neural Network (ResNet)

I am trying to understand if I can “summarize” the operations in the residual block with a single weight matrix and bias.
For example, consider the structure in the photo and suppose all the operations can be written in linear form (ignore the ReLU). Then I can compute the two matrices that tell me how I go from the input to the output of path a and of path b. The real output should then, by construction of residual networks, be the pointwise sum of what path a and path b produce.
My intuition and some examples tell me that the matrix describing how the input changes when going through the whole residual block should just be the sum of the matrices for path a and path b (i.e., every sum is done element-wise).
I would then implement this matrix using torch.add(mat_path_a, mat_path_b).
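To check my intuition, here is a minimal sketch, assuming each path really does reduce to an affine map `W @ x + b` (the names `W_a`, `b_a`, `W_b`, `b_b` are hypothetical placeholders for the composed maps of the two paths; if path b is a plain skip connection, `W_b` would be the identity and `b_b` zero):

```python
import torch

# Hypothetical dimensions; residual addition requires matching input/output shapes.
d_in, d_out = 4, 4

torch.manual_seed(0)
W_a = torch.randn(d_out, d_in)  # composed linear map of path a
b_a = torch.randn(d_out)        # composed bias of path a
W_b = torch.randn(d_out, d_in)  # composed linear map of path b
b_b = torch.randn(d_out)        # composed bias of path b

x = torch.randn(d_in)

# Output of the residual block: pointwise sum of the two path outputs.
y_paths = (W_a @ x + b_a) + (W_b @ x + b_b)

# Single summarizing map: sum the matrices and sum the biases.
W = torch.add(W_a, W_b)
b = torch.add(b_a, b_b)
y_merged = W @ x + b

# Distributivity: (W_a + W_b) x + (b_a + b_b) == (W_a x + b_a) + (W_b x + b_b)
assert torch.allclose(y_paths, y_merged)
```

This is just the distributive law of matrix-vector multiplication, so the summary matrix is exactly the element-wise sum, and the bias must be summed the same way.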
Is this possible?
Thanks in advance