Saving and loading a model's intermediate layer output (tensor object)

I am trying to save the intermediate output of a pre-trained model layer (a Tensor object) to a file and load it back using pickle dump/load. When I dump the tensor and load it back, the grad_fn attribute is removed and replaced by requires_grad.

Original:
tensor([[[[ 7.3672e-01, 3.9696e-01, 8.4902e-01, …, -5.8079e-01,
4.0203e+00, 1.5609e+00],
…,
[-9.5835e-01, 5.4335e-01, 9.7559e-01, …, -2.1748e+00,
-3.4150e+00, -8.7405e+00]]]], grad_fn=<MkldnnConvolutionBackward>)
Converted:
tensor([[[[ 7.3672e-01, 3.9696e-01, 8.4902e-01, …, -5.8079e-01,
4.0203e+00, 1.5609e+00],
…,
[-9.5835e-01, 5.4335e-01, 9.7559e-01, …, -2.1748e+00,
-3.4150e+00, -8.7405e+00]]]], requires_grad=True)
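
A minimal sketch of the round trip, with a small conv layer standing in here for the pre-trained model layer (just for illustration):

import pickle

import torch
import torch.nn as nn

# stand-in for the pre-trained model's intermediate layer
layer = nn.Conv2d(3, 8, kernel_size=3)
x = torch.randn(1, 3, 32, 32)

out = layer(x)
print(out.grad_fn)             # e.g. <MkldnnConvolutionBackward ...>, depending on version/backend

restored = pickle.loads(pickle.dumps(out))
print(restored.grad_fn)        # None -- the graph is not serialized
print(restored.requires_grad)  # True, as in the "Converted" output above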

Is there a way I can avoid this? Or is there a way to add grad_fn to a tensor object manually? For example, I want to know if there is a way to set MkldnnConvolutionBackward as the grad_fn when I reconstruct the tensor from the data of the original tensor object.

Thank you for any assistance.

No, I don’t think you can manually recreate the computation graph.
What’s your use case for serializing the tensor first and rebuilding the graph later?

My use case is to compute an intermediate layer's output and send it to another server, where the other half of the execution will be done, something like a client-server environment. For example, if an AlexNet has 12 layers, the first machine will run the first 6 layers, take the image as input, and produce the intermediate output (tensor object); that output is then sent over the network and used as the input for the second half of the model. Is there any way to do this? Even in my current scenario the numpy data is intact; only the grad_fn is missing for future prediction.
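
For the prediction-only case, a rough sketch of what that split could look like with torchvision's AlexNet (the split index and the use of torch.save/torch.load for the transfer are just illustrative; nothing here restores grad_fn):

import io

import torch
import torch.nn as nn
from torchvision import models

alexnet = models.alexnet(pretrained=True)   # the pre-trained model to be split

# first half, meant to run on the client: the first 6 feature layers (illustrative split point)
first_half = nn.Sequential(*list(alexnet.features.children())[:6])

# second half, meant to run on the server: remaining features, pooling, and classifier
second_half = nn.Sequential(
    *list(alexnet.features.children())[6:],
    alexnet.avgpool,
    nn.Flatten(),
    alexnet.classifier,
)

# client side: compute the intermediate activation and serialize it to bytes
x = torch.randn(1, 3, 224, 224)   # stand-in for the preprocessed image
with torch.no_grad():             # prediction only, no graph needed
    intermediate = first_half(x)
buffer = io.BytesIO()
torch.save(intermediate, buffer)  # buffer.getvalue() could be sent over the network

# server side: deserialize the activation and finish the forward pass
buffer.seek(0)
intermediate = torch.load(buffer)
with torch.no_grad():
    prediction = second_half(intermediate)
print(prediction.shape)           # torch.Size([1, 1000])

Since no backward pass is needed on the second machine for prediction, the missing grad_fn should not matter in this setup.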

This sounds like a distributed training setup. While the usual workflow would be to let each node compute a single forward and backward pass on its own, I think it should be possible to apply model sharding in this setup as well. Unfortunately, I don't have an example code snippet ready for it and would recommend looking into the functional API of the distributed package.
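
As a rough illustration of that direction, the point-to-point ops torch.distributed.send and torch.distributed.recv can carry an intermediate activation between two processes. The sketch below makes several assumptions (two processes on one machine, the gloo backend, dummy conv layers in place of the real model halves, and an activation shape both sides agree on):

import torch
import torch.distributed as dist
import torch.multiprocessing as mp
import torch.nn as nn


def run(rank, world_size):
    # both processes join the same group; the address/port are just for this local demo
    dist.init_process_group(
        "gloo", init_method="tcp://127.0.0.1:29500",
        rank=rank, world_size=world_size,
    )

    if rank == 0:
        # "client": the first half produces the activation and sends it to rank 1
        first_half = nn.Sequential(nn.Conv2d(3, 8, 3), nn.ReLU())
        with torch.no_grad():
            activation = first_half(torch.randn(1, 3, 32, 32))
        dist.send(activation, dst=1)
    else:
        # "server": receives the activation into a buffer of the agreed-upon shape
        # and finishes the forward pass
        activation = torch.empty(1, 8, 30, 30)
        dist.recv(activation, src=0)
        second_half = nn.Sequential(nn.Conv2d(8, 16, 3), nn.ReLU())
        with torch.no_grad():
            out = second_half(activation)
        print("rank 1 output shape:", out.shape)

    dist.destroy_process_group()


if __name__ == "__main__":
    mp.spawn(run, args=(2,), nprocs=2)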

Thank you for the information. I will try it and see.