How to save features of a layer and reuse them later?

Hello :slight_smile:

I want to use a pretrained model, such as resnet101, get the features after layer 4, and save them. I don't want to keep the pretrained model loaded afterwards, because I don't have the memory to do that.

Then I have another model in which I want to use those saved features.

To clarify, the features I want to save have a size of 1x1024xWxH, and when I reload them I don't care whether requires_grad is True or False.

How can I save the features, then read them back and use them in a new model? :exploding_head::woozy_face:

Why am I doing this? My second model is already so big and slow that I don't want to add anything else on top of it :man_shrugging:t2:

I think this question boils down to how to save and reuse the following tensor A:
A = torch.rand(1,1024,10,10)
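
Is it as simple as torch.save / torch.load on the tensor? Something like this (the file name is made up):

import torch

A = torch.rand(1, 1024, 10, 10)
torch.save(A, 'A.pt')    # write the tensor to disk

A = torch.load('A.pt')   # read it back later, e.g. in another script
print(A.shape)           # torch.Size([1, 1024, 10, 10])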

Just do the following:

import torchvision

model = torchvision.models.resnet18(pretrained=True)
num_ftrs = model.fc.in_features
model.fc = DCL()  # replace the final layer with your own model class

EDIT:
Sorry, you want the layers after layer 4:

model.state_dict()

and filter out the layers you need from that dict.
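
For instance, continuing from the snippet above, a quick sketch of filtering by parameter name (the 'layer4' prefix follows torchvision's resnet naming):

# keep only the parameters whose names start with a given prefix
wanted = {k: v for k, v in model.state_dict().items() if k.startswith('layer4')}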

I cannot do that, because I don't have the memory to load the model :expressionless:

P.S. There is something wrong with what you wrote, because the naming doesn't make sense: DCL shows up out of nowhere.

DCL() is your model; you can name it however you want.
You can create it using nn.Sequential, or write your own class that inherits from torch.nn.Module.
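
For example, continuing from the snippet above, either of these would do as a stand-in for DCL (the layer sizes and the class count of 10 are made up):

import torch.nn as nn
import torch.nn.functional as F

# option 1: a quick Sequential head
model.fc = nn.Sequential(
    nn.Linear(num_ftrs, 256),
    nn.ReLU(),
    nn.Linear(256, 10),
)

# option 2: an equivalent custom module
class DCL(nn.Module):
    def __init__(self, in_features, num_classes):
        super().__init__()
        self.fc1 = nn.Linear(in_features, 256)
        self.fc2 = nn.Linear(256, num_classes)

    def forward(self, x):
        return self.fc2(F.relu(self.fc1(x)))

model.fc = DCL(num_ftrs, num_classes=10)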

I see.
Thanks for your answer, but unfortunately I don't have the memory to do this.
I was wondering if there is a way to save and reload a tensor.

That is why you need a pretrained model with frozen layers. There is a good tutorial at https://pytorch.org/tutorials/beginner/transfer_learning_tutorial.html
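
Freezing just means switching off gradients for the pretrained weights, along the lines of the tutorial's fixed-feature-extractor setup (the 10 output classes here are an arbitrary example):

import torchvision
import torch.nn as nn

model = torchvision.models.resnet18(pretrained=True)
for param in model.parameters():
    param.requires_grad = False  # freeze every pretrained layer

model.fc = nn.Linear(model.fc.in_features, 10)  # new head; only its params stay trainable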

If you would like to store the activations, you could use a forward hook as described in this example. Instead of the print statements, use torch.save to store the activations from the dict.
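
A minimal sketch of that approach: note that in torchvision's resnet101 the block whose output has 1024 channels, matching your 1x1024xWxH, is model.layer3, so that is what is hooked here; swap in another submodule if you mean a different layer. The dummy input and file name are placeholders for your real data.

import torch
import torchvision

model = torchvision.models.resnet101(pretrained=True)
model.eval()

activation = {}

def get_activation(name):
    # returns a hook that stores the module's output under the given key
    def hook(module, input, output):
        activation[name] = output.detach()
    return hook

model.layer3.register_forward_hook(get_activation('feats'))

x = torch.randn(1, 3, 224, 224)  # dummy input batch; use your real images here
with torch.no_grad():
    model(x)

torch.save(activation['feats'], 'feats.pt')  # shape [1, 1024, 14, 14] for a 224x224 input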

Later, you can restore these activations using torch.load and use them as your training samples for the second model.
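
On the loading side it could look like this (the Conv2d is just a stand-in for your big second model):

import torch
import torch.nn as nn

feats = torch.load('feats.pt')  # shape [1, 1024, W, H]; detached, so requires_grad is False

second_model = nn.Conv2d(1024, 256, kernel_size=3, padding=1)  # stand-in for your model
out = second_model(feats)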


:pray:t2::pray:t2::pray:t2:
