Export to ONNX produces warning messages

I am trying to export the encoder model from this project. Since the ONNX export function does not work well with a list output, I changed the forward function of the encoder from

def forward(self, input_image):
        self.features = []
        x = (input_image - 0.45) / 0.225
        x = self.encoder.conv1(x)
        x = self.encoder.bn1(x)
        self.features.append(self.encoder.relu(x))
        self.features.append(self.encoder.layer1(self.encoder.maxpool(self.features[-1])))
        self.features.append(self.encoder.layer2(self.features[-1]))
        self.features.append(self.encoder.layer3(self.features[-1]))
        self.features.append(self.encoder.layer4(self.features[-1]))

        return self.features


to

def forward(self, input_image):
        x = (input_image - 0.45) / 0.225
        x = self.encoder.conv1(x)
        x = self.encoder.bn1(x)
        output1 = self.encoder.relu(x)
        output2 = self.encoder.layer1(self.encoder.maxpool(output1))
        output3 = self.encoder.layer2(output2)
        output4 = self.encoder.layer3(output3)
        output5 = self.encoder.layer4(output4)
        output_tensor = torch.ones(2887680)
        # guess I do not need to call clone?
        output_tensor[0:1966080] = output1.flatten()
        output_tensor[1966080:2457600] = output2.flatten()
        output_tensor[2457600:2703360] = output3.flatten()
        output_tensor[2703360:2826240] = output4.flatten()
        output_tensor[2826240:] = output5.flatten()

        return output_tensor

This shows me the warning:

TracerWarning: There are 2 live references to the data region being modified when tracing in-place operator copy_ (possibly due to an assignment). This might cause the trace to be incorrect, because all other views that also reference this data will not reflect this change in the trace! On the other hand, if all other views use the same memory chunk, but are disjoint (e.g. are outputs of torch.split), this might still be safe.

Is this safe?

Verifying against the output values, this doesn't work. What is the proper way to copy the data of different layers into one big tensor when exporting to an ONNX model?
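In eager mode the in-place copy and a functional concatenation give identical values, which suggests the mismatch comes from how tracing records the in-place copy_ rather than from the math itself. A minimal check, using small illustrative tensors rather than the real feature maps:

```python
import torch

# small stand-ins for two of the encoder feature maps (sizes are illustrative)
a, b = torch.randn(2, 3, 4), torch.randn(2, 6, 4)

# in-place approach, as in the forward above
out = torch.ones(a.numel() + b.numel())
out[:a.numel()] = a.flatten()
out[a.numel():] = b.flatten()

# functional approach
cat = torch.cat((a.flatten(), b.flatten()))

print(torch.equal(out, cat))  # True in eager mode
```

So both produce the same values when run directly; the divergence only appears once the in-place copies go through the tracer.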

I found the solution: use torch.cat.

return torch.cat((output1.flatten(), output2.flatten(), output3.flatten(), output4.flatten(), output5.flatten()))
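For completeness, the flat output can be split back into the five feature maps on the consumer side. This is a sketch; the shapes are inferred from the offsets above (they assume batch size 1, a ResNet-18 encoder, and a 640x192 input), so adjust them for your configuration:

```python
import torch

# stand-in for the flat model output (2887680 elements, as in the offsets above)
flat = torch.arange(2887680, dtype=torch.float32)

# assumed shapes matching the offsets 1966080 / 2457600 / 2703360 / 2826240
shapes = [(1, 64, 96, 320),   # output1: elements       0:1966080
          (1, 64, 48, 160),   # output2: elements 1966080:2457600
          (1, 128, 24, 80),   # output3: elements 2457600:2703360
          (1, 256, 12, 40),   # output4: elements 2703360:2826240
          (1, 512, 6, 20)]    # output5: elements 2826240:2887680

features, offset = [], 0
for shape in shapes:
    n = torch.Size(shape).numel()
    features.append(flat[offset:offset + n].view(shape))
    offset += n
assert offset == flat.numel()  # all 2887680 elements accounted for
```

Concatenating the flattened pieces back together round-trips to the original flat tensor, which confirms the offsets are consistent.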