How to split the backward pass layer by layer in a neural network?

Hi, @smth. Thanks a lot for this pointer.

I already tried what you suggested, i.e. calling .backward() on the output Variable returned by the module. The thing is, if I write forward() in the normal way, like this:

    def forward(self, x, target):
        for sub_module in self.module_list_0:
            x = sub_module(x)
        x = x.view(-1, 4*4*50)
        for sub_module in self.module_list_1:
            x = sub_module(x)
        loss = self.ceriation(x, target)
        return x, loss

then once I call .backward() on the last Variable, e.g. loss.backward(), the whole backward pass is executed in one go. But if I re-wrap the intermediate result after each forward step with something like x = Variable(x.data, requires_grad=True), as you mentioned, then no gradients seem to be computed for the earlier modules when I check param.grad for the parameters in module.parameters().

After reading the topic Assign manual assigned "grad_output", it seems that calling variable.backward(grad_output) could be the missing piece, but when I call loss.backward(grad_output) with the normal forward() above, the behavior is no different from plain loss.backward().
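For concreteness, here is a minimal sketch of how I understand the two-segment version is supposed to work; segment_0, segment_1, the shapes, and the dummy data are just placeholders, not my actual model:

    import torch
    from torch.autograd import Variable

    # Placeholder segments standing in for module_list_0 / module_list_1.
    segment_0 = torch.nn.Linear(10, 20)
    segment_1 = torch.nn.Linear(20, 5)
    criterion = torch.nn.CrossEntropyLoss()

    x = Variable(torch.randn(4, 10))
    target = Variable(torch.LongTensor([0, 1, 2, 3]))

    # Forward through the first segment.
    h = segment_0(x)

    # Cut the graph: wrap the intermediate output in a fresh leaf Variable.
    h_detached = Variable(h.data, requires_grad=True)

    # Forward through the second segment and the loss on the detached copy.
    loss = criterion(segment_1(h_detached), target)

    # Backward through the second segment only: this fills segment_1's
    # param.grad and h_detached.grad, but segment_0's grads stay None.
    loss.backward()

    # Backward through the first segment, passing the gradient w.r.t. the
    # intermediate output as grad_output -- the manual chain-rule step.
    h.backward(h_detached.grad)

Is this the intended pattern, i.e. every cut in the graph needs its own backward(grad_output) call, with the .grad of the re-wrapped Variable fed in as grad_output?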
What trick do I need in the forward/backward process to execute the backward pass layer by layer, i.e. take the gradient w.r.t. one layer's output and use it to run backward through the preceding layer, like applying the chain rule manually?
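In case it helps clarify what I'm after, here is what I imagine the fully layer-by-layer version would look like; again just a sketch with placeholder layers and dummy data, and I'm not sure it is correct:

    import torch
    from torch.autograd import Variable

    # Placeholder layers; in my real model this would be the chain of
    # sub-modules from module_list_0, the view(), and module_list_1.
    layers = [torch.nn.Linear(10, 20), torch.nn.ReLU(), torch.nn.Linear(20, 5)]
    criterion = torch.nn.CrossEntropyLoss()

    x = Variable(torch.randn(4, 10))
    target = Variable(torch.LongTensor([0, 1, 2, 3]))

    # Forward: feed every layer a detached copy of the previous output,
    # remembering both the detached input and the attached output per layer.
    inputs, outputs = [], []
    h = x
    for layer in layers:
        h_in = Variable(h.data, requires_grad=True)
        h = layer(h_in)
        inputs.append(h_in)
        outputs.append(h)

    loss = criterion(outputs[-1], target)

    # Backward, one layer at a time (manual chain rule): loss.backward()
    # fills the last layer's param.grad and inputs[-1].grad; each earlier
    # layer then receives the next layer's input gradient as grad_output.
    loss.backward()
    for i in reversed(range(len(layers) - 1)):
        outputs[i].backward(inputs[i + 1].grad)

Is this the right way to chain the per-layer backward calls, or is there a cleaner trick I am missing?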

Thanks a lot!