Replace ResNet50 fc with an MLP

Hi guys, I need to do “feature sharing”: I need to pass the ResNet50 features (2048-dim) to an MLP.

import torch.nn as nn
from torchvision.models import resnet50

class pipo(nn.Module):
    def __init__(self, n_input, n_output, n_hidden):
        super(pipo, self).__init__()
        self.model = resnet50(pretrained=True)
        self.model.fc = nn.Linear(self.model.fc.in_features, n_input)
        self.fc = nn.Linear(self.model.fc, n_hidden)
        self.mu = nn.Linear(n_hidden, n_output)

    def forward(self, x):
        etc...

Is it right this way?

This looks generally good.
It might be a typo, but self.fc should most likely be defined as:

self.fc = nn.Linear(self.model.fc.out_features, n_hidden)

Can you clarify the difference between …fc.in_features and …fc.out_features?

in_features defines the number of input features the tensor must have for this linear layer, while out_features defines the number of output features.
E.g. if you are working with 2 input features per sample (say height and weight for a dog classifier), you would set in_features=2 in your first linear layer. The number of output features depends on your architecture, and you can choose any valid value that “works”.
Have a look at CS231n for an example of the weight matrix and the names mentioned above.
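Both values can be read directly off any linear layer; a minimal standalone example:

```python
import torch
import torch.nn as nn

# A linear layer mapping 2 input features (e.g. height and weight)
# to 16 output features; the weight matrix has shape [out_features, in_features]
layer = nn.Linear(2, 16)
print(layer.in_features)   # 2
print(layer.out_features)  # 16
print(layer.weight.shape)  # torch.Size([16, 2])

# A batch of 4 samples with 2 features each maps to 4 samples with 16 features
out = layer(torch.randn(4, 2))
print(out.shape)           # torch.Size([4, 16])
```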


I have a doubt about the FC layer of my ResNet; I don't know if it is right.
Can you help me?
Or should I open a new topic?

    def __init__(self, n_input, n_output, n_hidden1, n_hidden2):
        super(ActorCriticModel, self).__init__()
        self.model = resnet50(pretrained=True)
        out = self.model.fc.in_features
        self.model.fc = Identity()  # Identity is a class whose forward simply returns x
        self.fcN = nn.Linear(out, n_hidden1)  # actor branch
        self.fcV = nn.Linear(out, n_hidden2)  # critic branch
        #self.model.fc = torch.cat(self.fcN + self.fcV)
        self.muX = nn.Linear(n_hidden1, 1)
        self.muZ = nn.Linear(n_hidden1, 1)
        self.sigma = nn.Linear(n_hidden1, 1)
        self.muO1 = nn.Linear(n_hidden1, 1)
        self.muO2 = nn.Linear(n_hidden1, 1)
        self.sigmaO = nn.Linear(n_hidden1, 1)
        self.value = nn.Linear(n_hidden2, 1)
        self.distributionXZ = torch.distributions.MultivariateNormal
        self.distributionO = torch.distributions.MultivariateNormal

What doubts do you have about the approach?