Is it possible to check whether all the layers are used during backpropagation (basically, to see if a layer was forgotten or if its output is not connected to the final node)?
param.grad is None right after initialization; a call to backward() populates it with a gradient tensor for every parameter that participated in the graph. So after one backward pass, any parameter whose gradient is still None was never reached:
for name, param in model.named_parameters():
    if param.grad is None:
        print(name)
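
As a minimal sketch (the Net model below is a hypothetical toy, not from the original post), here is the check end to end, with one layer deliberately left out of forward():

import torch
import torch.nn as nn

class Net(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(10, 5)
        self.unused = nn.Linear(10, 5)  # deliberately never called in forward()
        self.fc2 = nn.Linear(5, 1)

    def forward(self, x):
        # self.unused is skipped, so its parameters never enter the graph
        return self.fc2(torch.relu(self.fc1(x)))

model = Net()
loss = model(torch.randn(4, 10)).sum()
loss.backward()

# Parameters whose .grad is still None were not part of the backward pass
for name, param in model.named_parameters():
    if param.grad is None:
        print(name)

This prints only unused.weight and unused.bias, since those parameters never entered the autograd graph.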