Output from hidden layers

Hi guys,

I’m new to PyTorch and usually write my networks in TensorFlow. I have some questions on how to correctly do stuff in PyTorch.

Suppose I have a two-layer network called A(x):

class A(nn.Module):
    def __init__(self):
        super(A, self).__init__()
        self.fc1 = nn.Linear(100, 100)
        self.fc2 = nn.Linear(100, 10)

    def forward(self, x):
        x = self.fc1(x)
        x = F.relu(x)
        x = self.fc2(x)
        x = F.relu(x)
        return x

Now I need outputs from fc1 and fc2 before applying relu. What is the ‘PyTorch’ way of achieving this? I was thinking of writing something like this:

def hidden_outputs(self, x):
    outs = {}
    x = self.fc1(x)
    outs['fc1'] = x
    ...
    return outs

and then calling A.hidden_outputs(x) from another script. Also, is it okay to write any function in addition to forward in the class? Can I for example write:

def apply_softmax(self, x):
    x = self.forward(x)
    x = F.softmax(x, dim=1)  # softmax needs an explicit dim in recent PyTorch
    return x

and use the function above to calculate gradients etc. in another script?

...

net = A()
x = data.requires_grad_()  # so that x.grad gets populated

out = net.apply_softmax(x)
out.sum().backward()  # backward() needs a scalar, so reduce first
x_grad = x.grad

...
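Concretely, computing input gradients through an extra method works like this. A minimal sketch with random stand-in data (your real `data` would go where `torch.randn` is):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class A(nn.Module):
    def __init__(self):
        super(A, self).__init__()
        self.fc1 = nn.Linear(100, 100)
        self.fc2 = nn.Linear(100, 10)

    def forward(self, x):
        x = F.relu(self.fc1(x))
        x = F.relu(self.fc2(x))
        return x

    def apply_softmax(self, x):
        # extra methods besides forward are perfectly fine
        return F.softmax(self.forward(x), dim=1)

net = A()
x = torch.randn(4, 100, requires_grad=True)  # stand-in for your real `data`
out = net.apply_softmax(x)
out[:, 0].sum().backward()  # backward() needs a scalar; e.g. class-0 probability
x_grad = x.grad
print(x_grad.shape)  # torch.Size([4, 100])
```

Autograd tracks any method that uses the module's layers, not just forward, so gradients flow back to `x` as usual.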

Here’s how you would rewrite A(x).

class A(nn.Module):
    def __init__(self):
        super(A, self).__init__()
        self.fc1 = nn.Linear(100, 100)
        self.fc2 = nn.Linear(100, 10)

    def forward(self, x):
        fc1 = self.fc1(x)
        a = F.relu(fc1)

        fc2 = self.fc2(a)
        b = F.relu(fc2)
        return fc1, fc2, a, b

This gives you the pre-activation outputs of fc1 and fc2, as well as the ReLU-applied outputs.
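For example, with a batch of random inputs (batch size 32 is arbitrary here):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class A(nn.Module):
    def __init__(self):
        super(A, self).__init__()
        self.fc1 = nn.Linear(100, 100)
        self.fc2 = nn.Linear(100, 10)

    def forward(self, x):
        fc1 = self.fc1(x)
        a = F.relu(fc1)
        fc2 = self.fc2(a)
        b = F.relu(fc2)
        return fc1, fc2, a, b

net = A()
x = torch.randn(32, 100)  # batch of 32 random inputs
fc1, fc2, a, b = net(x)
print(fc1.shape)  # torch.Size([32, 100])
print(fc2.shape)  # torch.Size([32, 10])
```

The caller just unpacks whichever outputs it needs.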

If you would like to re-use the weights of one of the FC layers in some other script, as you mentioned:

# Load model.
model = A()
model.load_state_dict(...)

# Re-use trained FC1 layer weights (in current PyTorch you can pass a
# plain tensor directly instead of wrapping it in Variable).
fc1_outputs = model.fc1(Variable(...))
fc1_outputs = F.softmax(fc1_outputs, dim=1)

and so on.
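A self-contained sketch of that pattern; since there is no real checkpoint here, the "trained" weights are simulated with a second net instance:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class A(nn.Module):
    def __init__(self):
        super(A, self).__init__()
        self.fc1 = nn.Linear(100, 100)
        self.fc2 = nn.Linear(100, 10)

    def forward(self, x):
        return F.relu(self.fc2(F.relu(self.fc1(x))))

# Pretend `trained` holds trained weights; here it is just a fresh net.
trained = A()
model = A()
model.load_state_dict(trained.state_dict())

# Re-use the fc1 layer on its own, outside of forward().
x = torch.randn(4, 100)
fc1_outputs = model.fc1(x)
probs = F.softmax(fc1_outputs, dim=1)
print(probs.shape)  # torch.Size([4, 100])
```

Each submodule (`model.fc1`, `model.fc2`) is itself a callable `nn.Module`, so it can be applied independently of the parent's forward.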

Thanks. Your suggestion looks clean, and I like the idea of having just one function in the class. I will just return a dictionary with all the outputs from the forward function:

def forward(self, x):
    outs = {}
    fc1 = self.fc1(x)
    outs['fc1'] = fc1
    relu1 = F.relu(fc1)
    outs['relu_fc1'] = relu1
    ...
    return outs
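Filling in the remaining layers the same way, a complete version of this dict-returning forward might look like (the key names are just the ones used above):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class A(nn.Module):
    def __init__(self):
        super(A, self).__init__()
        self.fc1 = nn.Linear(100, 100)
        self.fc2 = nn.Linear(100, 10)

    def forward(self, x):
        outs = {}
        fc1 = self.fc1(x)
        outs['fc1'] = fc1
        relu1 = F.relu(fc1)
        outs['relu_fc1'] = relu1
        fc2 = self.fc2(relu1)
        outs['fc2'] = fc2
        outs['relu_fc2'] = F.relu(fc2)
        return outs

net = A()
outs = net(torch.randn(8, 100))
print(sorted(outs))  # ['fc1', 'fc2', 'relu_fc1', 'relu_fc2']
```

One caveat with this approach: utilities that expect forward to return a tensor (e.g. `nn.Sequential`) won't work with a dict return, so it is best suited to models you call directly.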