Hi,
I have a neural network and want to calculate the backward pass one layer at a time. Let’s say this is the network:
import torch
import torch.nn as nn

class DeepNN(nn.Module):
    def __init__(self, D_in, H, D_out, num_hidden):
        super(DeepNN, self).__init__()
        self.fc_1 = nn.Linear(D_in, H)      # input layer
        self.fc_end = nn.Linear(H, D_out)   # output layer
        self.module_list = [self.fc_1]
        for i in range(num_hidden):         # hidden layers
            self.module_list.append(nn.Linear(H, H))
        self.module_list.append(self.fc_end)
        self.layers = nn.ModuleList(self.module_list)
        self.f = nn.Sequential(*self.module_list)
        self._name = "DeepNN"

    def forward(self, x):
        return self.f(x)
Now, instead of calling loss.backward(), I want to compute the gradients layer by layer, modify each layer's gradient, and then pass the modified gradient on to the next layer down.
Is there a proper way to do that?
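To make the question concrete, here is a rough sketch of what I have in mind, using torch.autograd.grad (not a definitive solution; modify_grad is just a placeholder for whatever per-layer change I want to apply, and the dimensions are made up). The forward pass is unrolled so every intermediate activation stays available, and then the layers are walked in reverse, one backward step at a time:

    import torch
    import torch.nn.functional as F

    def modify_grad(g):
        # placeholder: whatever change I want to make to a layer's gradient
        return g * 0.5

    D_in, H, D_out, num_hidden = 10, 20, 2, 3
    model = DeepNN(D_in, H, D_out, num_hidden)

    # requires_grad on the input so the final autograd.grad call
    # (w.r.t. the first layer's input) does not fail
    x = torch.randn(8, D_in, requires_grad=True)
    target = torch.randn(8, D_out)

    # forward pass, keeping every intermediate activation
    acts = [x]
    for layer in model.layers:
        acts.append(layer(acts[-1]))

    loss = F.mse_loss(acts[-1], target)

    # gradient of the loss w.r.t. the network output
    grad_out = torch.autograd.grad(loss, acts[-1], retain_graph=True)[0]

    # walk the layers in reverse, one backward step per layer
    for i in reversed(range(len(model.layers))):
        layer, inp = model.layers[i], acts[i]
        g_inp, g_w, g_b = torch.autograd.grad(
            outputs=acts[i + 1],
            inputs=(inp, layer.weight, layer.bias),
            grad_outputs=grad_out,
            retain_graph=True,
        )
        layer.weight.grad = g_w   # store this layer's parameter gradients
        layer.bias.grad = g_b
        grad_out = modify_grad(g_inp)  # modified gradient fed to the layer below

Is something like this the intended way to do it, or is there a cleaner mechanism (e.g. backward hooks)?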