Register_backward_hook on one layer in a DataParallel module

SOLVED: calling DataParallel on each individual layer (instead of on the whole network) seems to have solved this.
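For anyone landing here later, this is a minimal sketch of that workaround. The network, layer names (fc1/fc2), and shapes below are made up for illustration; the point is only that each layer gets its own DataParallel wrapper and the hook is registered on the wrapper, i.e. after the DataParallel call:

```python
import torch
import torch.nn as nn

# Toy network; the layer names and sizes are placeholders.
class Net(nn.Module):
    def __init__(self):
        super(Net, self).__init__()
        self.fc1 = nn.Linear(128, 64)
        self.fc2 = nn.Linear(64, 10)

    def forward(self, x):
        return self.fc2(torch.relu(self.fc1(x)))

net = Net().cuda()

# Wrap each layer in its own DataParallel instead of wrapping the whole net.
net.fc1 = nn.DataParallel(net.fc1)
net.fc2 = nn.DataParallel(net.fc2)

saved = {}

def make_hook(name):
    def hook(module, grad_input, grad_output):
        # Stash the gradient w.r.t. this layer's output.
        saved[name] = grad_output[0].detach().cpu()
    return hook

# Register the hooks on the DataParallel wrappers themselves.
net.fc1.register_backward_hook(make_hook('fc1'))
net.fc2.register_backward_hook(make_hook('fc2'))

out = net(torch.randn(32, 128).cuda())
out.sum().backward()
print(saved['fc1'].shape, saved['fc2'].shape)
```

The cost of this approach is an extra scatter/gather per wrapped layer, but the hooks then fire on the wrapper rather than on per-GPU replicas, so the saved values stick around after backward.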

The start of the discussion is here: "Can't save in backward hook on multi GPU" (see the 3rd post). Please help.

I want to save values inside a backward_hook function for various layers of my network while running on multiple GPUs. It looks like I need to call register_backward_hook after the call to DataParallel for this to work. Is there a way to register a backward hook on a specific layer after calling net = torch.nn.DataParallel(net…)? My only idea right now is to call DataParallel on each individual layer instead of on the whole network. If I try net.module.layerX.register_backward_hook (see the snippet below), the hook is attached to the plain layer rather than to the DataParallel wrapper, and I run into the same problems as in the original thread.
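For reference, this is the pattern that runs into the problem described above. It is only a sketch: the one-layer Sequential and the name fc stand in for the real network and layerX.

```python
import torch
import torch.nn as nn

# Made-up stand-in for the real network; 'fc' plays the role of layerX.
net = nn.Sequential()
net.add_module('fc', nn.Linear(16, 4))
net = nn.DataParallel(net).cuda()

def hook(module, grad_input, grad_output):
    print(grad_output[0].shape)

# Reaching through .module attaches the hook to the bare layer, not to
# anything DataParallel-aware, so it hits the same problem as before.
net.module.fc.register_backward_hook(hook)
```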