Torchvision deformable conv layer memory leak

When using a network with a normal nn.Conv2d, everything is fine.
After replacing it with torchvision's deformable conv layer, CUDA memory keeps growing every iteration.

I tried mmcv's deformable conv pack too, and the same thing happens.
What could be the issue?

Could you post a minimal, executable code snippet which would reproduce this issue, please?

I had registered forward hooks to access the offsets of the deformable conv for visualization.
The hooks were left attached and kept accumulating tensors every iteration, which caused the leak.
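For reference, a minimal sketch of the leak pattern and its fix, assuming a setup like the one described. A plain nn.Conv2d stands in for the offset branch so the snippet is self-contained; the hook behavior is the same for torchvision's DeformConv2d. Storing a hook's raw output keeps the autograd graph (and any CUDA buffers it references) alive, and a hook handle that is never removed keeps capturing on every forward pass:

```python
import torch
import torch.nn as nn

# Stand-in for the offset-producing conv (hypothetical shapes).
offset_conv = nn.Conv2d(3, 18, kernel_size=3, padding=1)

captured = []

# Leaky version would be: captured.append(output)
# -- the raw output carries its autograd graph, so every stored
# tensor pins the graph and its (possibly CUDA) buffers in memory.
def save_offsets(module, inputs, output):
    # Safe version: detach from the graph and move to CPU, since
    # the offsets are only needed for visualization.
    captured.append(output.detach().cpu())

handle = offset_conv.register_forward_hook(save_offsets)

for _ in range(3):
    x = torch.randn(1, 3, 16, 16)
    offset_conv(x)

# Remove the hook once the offsets are no longer needed.
handle.remove()
offset_conv(torch.randn(1, 3, 16, 16))  # no longer captured

print(len(captured))  # 3 captures; none after remove()
```

The two fixes are independent: detaching prevents each stored tensor from retaining the graph, and removing the handle stops the list from growing once visualization is done.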

Thanks for the response.
The issue is resolved.