Using non-full backward hooks on a Module that does not return a single Tensor

I’m getting this warning since I upgraded to 1.8. I have a custom module that returns `tensor, (tensor, tensor)`, which triggers it.
What is this warning trying to say?

The warning is raised if you are using register_backward_hook, which is deprecated in favor of register_full_backward_hook. Could you swap these methods and rerun your script?
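The swap is a one-line change. A minimal sketch (the module and the hook body are placeholders, not your actual code):

```python
import torch
import torch.nn as nn

model = nn.Linear(4, 2)
grads = []

def hook(module, grad_input, grad_output):
    # grad_output is a tuple of tensors, one per module output
    grads.append(grad_output)

# Deprecated API that can raise the warning:
#   handle = model.register_backward_hook(hook)
# New API, available since PyTorch 1.8:
handle = model.register_full_backward_hook(hook)

out = model(torch.randn(3, 4))
out.sum().backward()
handle.remove()
print(len(grads))  # the hook fired once
```

The full hook receives the gradients w.r.t. all module inputs and outputs, which is why it works correctly for modules with multiple outputs.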

@ptrblck hmm, the thing is I’m not explicitly registering a backward hook anywhere in my module.

My module consists of a bunch of normal PyTorch modules/functions which have their own backward methods. So I don’t see the need to write my own backward method. Do I need to write one now?

That’s strange, as this warning is raised in these lines of code, which shouldn’t be triggered if no backward hooks are used.
Could you post a minimal, executable code snippet that raises this warning without the explicit use of backward hooks, please?

Hi, I use this code snippet, which triggers the warning:

    # requires: from functools import partial
    def hook_intermediate_layers(self):
        for name, m in self.backbone.named_modules():
            if name in self.return_layers:
                m.register_forward_hook(partial(self.copy_output, feat_id=self.return_layers[name]))

What should I do to eliminate the warning?
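For context, here is the same hook-registration pattern extracted in isolation (the backbone, layer names, and `copy_output` are stand-ins for my actual model). Run on its own, this does not reproduce the warning for me, so the trigger must come from somewhere else in the full model:

```python
from functools import partial

import torch
import torch.nn as nn

class FeatureExtractor(nn.Module):
    """Stand-in for the real model; structure and names are illustrative."""

    def __init__(self):
        super().__init__()
        # hypothetical small backbone; the real one is larger
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 8, 3, padding=1),
            nn.ReLU(),
            nn.Conv2d(8, 16, 3, padding=1),
        )
        # map module names to feature ids, as in the original snippet
        self.return_layers = {"0": "feat1", "2": "feat2"}
        self.features = {}
        self.hook_intermediate_layers()

    def copy_output(self, module, inputs, output, feat_id):
        # forward hooks receive (module, inputs, output)
        self.features[feat_id] = output

    def hook_intermediate_layers(self):
        for name, m in self.backbone.named_modules():
            if name in self.return_layers:
                m.register_forward_hook(
                    partial(self.copy_output, feat_id=self.return_layers[name])
                )

    def forward(self, x):
        return self.backbone(x)

model = FeatureExtractor()
out = model(torch.randn(1, 3, 8, 8))
print(sorted(model.features))  # ['feat1', 'feat2']
```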

Could you post an executable code snippet to reproduce this issue, as I’m still unable to reproduce and debug it?

Please give me some time to work on a reproduction.