CPU Memory Leak in a Custom Loss Function

Hi all,
I have a custom loss function that calls another, computationally intensive function X. The call to X returns an array of values that is then used to compute the loss (autograd is used throughout). Roughly, the structure looks like the sketch below.
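This is only a simplified sketch of the setup; `expensive_x` stands in for my real X, which is far more compute-heavy:

```python
import torch

def expensive_x(inputs):
    # Stand-in for the real X: a calculation-intensive function that
    # returns an array/tensor of values derived from the inputs.
    return inputs ** 2 + inputs.sin()

def custom_loss(predictions, targets):
    # X is called inside the loss; its output feeds the loss calculation,
    # and autograd tracks the whole chain.
    values = expensive_x(predictions)
    return torch.mean((values - targets) ** 2)

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
preds = torch.randn(1024, requires_grad=True, device=device)
targets = torch.randn(1024, device=device)

loss = custom_loss(preds, targets)
loss.backward()
```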
Comparing available CPU (host) memory before and after the call to X, I see a drop, even though the computations are running on a GPU.
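For reference, a minimal way to check this (using psutil here only as an example; my actual measurement may differ):

```python
import psutil
import torch

def expensive_x(inputs):
    # stand-in for the real calculation-intensive X
    return inputs ** 2 + inputs.sin()

preds = torch.randn(1024, requires_grad=True)

avail_before = psutil.virtual_memory().available  # free host RAM before the call
values = expensive_x(preds)
avail_after = psutil.virtual_memory().available   # free host RAM after the call

print(f"Available CPU memory dropped by {(avail_before - avail_after) / 1e6:.1f} MB")
```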
Any idea why the available CPU memory would decrease here?

Thanks