Correct way to run parallel blocks of convolutions

Here is my setup:
I have a signal x of dimension C x L, where C is the number of channels and L is the length of the signal. My neural network has C different convolutional blocks, each operating on a single channel of x. I want to aggregate the outputs of all the individual channels into a new variable, say x1, and pass it as input to a second neural network.
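In code, my setup looks roughly like this (a simplified sketch; the names, Conv1d, the kernel size, and nn.Identity are placeholders for my actual blocks and second network):

```python
import torch
import torch.nn as nn

class MyNet(nn.Module):
    def __init__(self, C):
        super().__init__()
        # one independent convolutional block per input channel
        self.blocks = nn.ModuleList(
            [nn.Conv1d(1, 1, kernel_size=3, padding=1) for _ in range(C)]
        )
        # second network that receives the aggregated x1
        self.second_net = nn.Identity()  # placeholder

    def forward(self, x):                      # x: (N, C, L)
        x1 = self.blocks[0](x[:, 0:1, :])      # first channel through its block
        for i in range(1, len(self.blocks)):
            out = self.blocks[i](x[:, i:i + 1, :])
            x1 = torch.cat((x1, out), 0)       # aggregate per-channel outputs
        return self.second_net(x1)
```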

The problem is with the aggregation. I am using torch.cat (I also experimented with torch.stack; the same error persists), for example x1 = torch.cat((x1, out), 0), growing x1 inside a loop. The forward pass runs without problems, but the error occurs when I call loss.backward().
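A minimal, self-contained version of just the failing pattern (the sizes and the loss are illustrative):

```python
import torch
import torch.nn as nn

C, L = 4, 32
x = torch.randn(1, C, L)
blocks = nn.ModuleList(
    [nn.Conv1d(1, 1, kernel_size=3, padding=1) for _ in range(C)]
)

x1 = blocks[0](x[:, 0:1, :])
for i in range(1, C):
    out = blocks[i](x[:, i:i + 1, :])
    x1 = torch.cat((x1, out), 0)   # also tried torch.stack here; same error

loss = x1.sum()    # stand-in for my real loss
loss.backward()    # this is where the error is raised
```

The error I receive is: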

```
RuntimeError: function SetItemBackward returned a gradient different than None at position 3, but the corresponding forward input was not a Variable.
```

I am unable to figure out how to resolve this error, and I can't see where SetItemBackward is even being used. Is there some other way this should be done? What mistake am I making? Any help would be appreciated. Thanks.